After more than a year of testing Amazon’s high-tech facial recognition software, the city of Orlando announced Thursday that it will not continue the program, citing a lack of resources needed to complete testing, according to a memo sent to city council members that day.
The letter, sent by Orlando’s Chief Administrative Officer Kevin Edmonds, police Chief Orlando Rolón and Chief Information Officer Rosa Akhtarkhavari, said the city is ending its use of Amazon’s face-matching software, Rekognition, because it “was not able to dedicate the resources to the pilot to enable us to make any noticeable progress toward completing the needed configuration and testing.”
It started as a way to trace family history. It evolved into a tool to help solve decades-old cold cases. Now, for apparently the first time, a genealogy database is expected to lead to charges being dropped against an Idaho man convicted in a decades-old rape and murder case.
There is “clear and convincing evidence” that Christopher Tapp, who served 20 years in prison, was wrongfully convicted in the 1996 killing of 18-year-old Angie Dodge, Bonneville County Prosecutor Daniel Clark wrote in a court filing last week.
These conferences will feature nationally recognized faculty in a mix of plenary sessions, simultaneous sessions and small group breakouts. Participants will have the opportunity to choose sessions to best fit their individual needs. There will also be networking opportunities to create relationships to sustain the support provided during the live event. Co-hosted by the Washington Defender Association, the Innocence Project Northwest and the University of Washington School of Law.
Immigration and Customs Enforcement and the FBI are among 17 federal agencies that have access to every Florida driver’s license through a massive facial recognition network, records obtained by the Orlando Sentinel show.
The network, called Face Analysis Comparison & Examination System (FACES), is maintained by the Pinellas County Sheriff’s Office and accessed by 273 “partner agencies,” including Customs and Border Protection and the IRS, as part of an exhaustive push from police agencies to use facial recognition as a law-enforcement tool.
San Francisco police officials gathered a room of reporters at department headquarters almost a year ago to make a stunning announcement: They had used DNA evidence to identify and jail an alleged serial sexual predator dubbed the “Rideshare Rapist” who terrorized women for years while posing as a driver for a ride-hailing service.
The arrest intensified the focus on rider safety in the emerging app-based industry. Immigration officials seized on the case, pointing out that the suspect, 38-year-old Orlando Vilchez Lazo, was in the country illegally from Peru. Meanwhile, revelations that Vilchez Lazo had worked for Lyft raised questions about the company’s background checks.
The largest manufacturer of police body cameras is rejecting the possibility of selling facial recognition technology – at least, for now.
Axon, formerly known as Taser International, has worked with more than 18,000 law enforcement agencies worldwide, selling a suite of products that include body cameras and software. It says 48 of 79 major city law enforcement agencies in North America are Axon customers.
A scandal of falsified drug arrests is spreading at a Florida sheriff’s office that has also spent more than $1.33 million settling excessive force lawsuits and is at the center of the increasingly troubled Robert Kraft case.
Ariella Russcol specializes in drama at the Frank Sinatra School of the Arts in Queens, New York, and the senior’s performance on this April afternoon didn’t disappoint. While the library is normally the quietest room in the school, her ear-piercing screams sounded more like a horror movie than study hall. But they weren’t enough to set off a small microphone in the ceiling that was supposed to detect aggression.
In a unanimous decision released Friday afternoon, the state Supreme Court threw out the 1989 murder convictions of two New Milford men and delivered a stinging rebuke to renowned forensic science expert Henry Lee, whose inaccurate testimony put them in prison for decades.
Sean Henning and Ralph Birch were convicted in separate trials for the bloody murder of Everett Carr, who was stabbed 47 times, his throat slit and his blood tracked through the house. They were convicted partially based on the testimony of Lee, who told jurors that a towel in the bathroom of Carr’s home had a spot on it that he had tested and found was “consistent with blood.”
Tuesday, 16 July 2019, 12:00 PM – 1:00 PM
Technology & Criminal Law Committee Minor CLE
The Technology Committee & The Criminal Law Committee present:
Digital Forensics Brief For Attorneys
Description: Most attorneys are not aware of how digital artifacts can play a critical role in their cases, as digital forensics dives much deeper than litigation support and electronic discovery. This 1 HR talk will give attendees a better understanding about how digital artifacts are used to tell a story in litigation and criminal cases.
Instructors of this course are seasoned digital forensics experts who have testified in state and federal courts and worked hundreds of cases. Upon completion, attendees will be prepared to contact a digital forensics expert, know the answers to some of the most commonly asked questions, and have an expanded perspective for how digital analysis can affect their cases.
Location: OCBA Center – 880 N. Orange Ave
Program: 12:00pm – 1:00pm
CLE: 1.0 General & 1.0 Technology CLE Credit
- Aaron Weiss
- Santiago Ayala
Sponsored by: Hex 21 Group
Registration deadline: July 14, 2019
To Register visit the OCBA Store by Clicking Here
If you have any questions or need help registering, please contact Ashley Norris at firstname.lastname@example.org or 407-422-4551 ext. 233.
***If you are an OCBA Member please make sure you login to your account before registering for any events/seminars to receive your member discount if applicable***
A government watchdog says the FBI has access to about 640 million photographs — including from driver’s licenses, passports and mugshots — that can be searched using facial recognition technology.
The figure reflects how the technology is becoming an increasingly powerful law enforcement tool, but is also stirring fears about the potential for authorities to intrude on the lives of Americans. It was reported by the Government Accountability Office at a congressional hearing in which both Democrats and Republicans raised questions about the use of the technology.
Since April 2018, when police announced they had apprehended Joseph DeAngelo, the man they alleged to be the long-elusive Golden State Killer, the floodgates have opened.
The key insight responsible for DeAngelo’s arrest came courtesy of a then-little-known forensic technique known as genetic genealogy: a method in which investigators try to link crime scene DNA to DNA from biological relatives in the hopes of generating leads for identifying suspects or remains. The science behind the technique has been around for a while. Yet the real potential to get hits in these searches has only been made possible by the recent advent of online, easily accessible DNA databases like GEDmatch (where police got a match for a distant relative of DeAngelo’s) and FamilyTreeDNA—sites that now boast more than 1 million user profiles each. Many of these come from individuals who uploaded their own genetic data from popular consumer DNA testing kits like 23andMe and AncestryDNA.
A Democratic senator sent a letter to Amazon CEO Jeff Bezos Thursday requesting information about why the company retains transcripts of conversations recorded by Amazon Echo devices, even after users have pressed “delete.”
Amazon’s voice-controlled operating system Alexa transcribes the conversations it picks up after users say a “wake word” — “Alexa,” “Echo,” “Amazon” or “computer” — or press a button to enable the Echo, according to a report by CNET. And the company saves those text files on its servers even after users opt to “delete” the audio files from the cloud, a CNET investigation revealed.
A Central Florida company has been awarded a $225,000 grant to develop technology to take the human factor out of the field tests law-enforcement officers use to identify illegal drugs and make arrests.
IDEM LLC, a client of the UCF Business Incubation Program, received the grant from the National Science Foundation.
A sophisticated voice-identity technology that monitors inmates on prison telephones has been installed in at least 23 Florida counties and has been used to bring criminal charges against inmates in at least one of them, a Fresh Take Florida investigation found.
The technology produced by a secretive, Dallas-based company is designed to make and store voice prints of inmates and to ensure they are using prison phones under their own identities, rather than secretly making calls using the IDs of other inmates.
San Francisco, long one of the most tech-friendly and tech-savvy cities in the world, is now the first in the United States to prohibit its government from using facial-recognition technology.
The ban is part of a broader anti-surveillance ordinance that the city’s Board of Supervisors approved on Tuesday. The ordinance, which outlaws the use of facial-recognition technology by police and other government departments, could also spur other local governments to take similar action. Eight of the board’s 11 supervisors voted in favor of it; one voted against it, and two who support it were absent.
Law enforcement agencies are increasingly using predictive policing systems to forecast criminal activity and allocate police resources. Yet in numerous jurisdictions, these systems are built on data produced during documented periods of flawed, racially biased, and sometimes unlawful practices and policies (“dirty policing”). These policing practices and policies shape the environment and the methodology by which data is created, which raises the risk of creating inaccurate, skewed, or systemically biased data (“dirty data”). If predictive policing systems are informed by such data, they cannot escape the legacies of the unlawful or biased policing practices that they are built on. Nor do current claims by predictive policing vendors provide sufficient assurances that their systems adequately mitigate or segregate this data.
In our research, we analyze thirteen jurisdictions that have used or developed predictive policing tools while under government commission investigations or federal court monitored settlements, consent decrees, or memoranda of agreement stemming from corrupt, racially biased, or otherwise illegal policing practices. In particular, we examine the link between unlawful and biased police practices and the data available to train or implement these systems. We highlight three case studies: (1) Chicago, an example of where dirty data was ingested directly into the city’s predictive system; (2) New Orleans, an example where the extensive evidence of dirty policing practices and recent litigation suggests an extremely high risk that dirty data was or could be used in predictive policing; and (3) Maricopa County, where despite extensive evidence of dirty policing practices, a lack of public transparency about the details of various predictive policing systems restricts a proper assessment of the risks. The implications of these findings have widespread ramifications for predictive policing writ large.
Deploying predictive policing systems in jurisdictions with extensive histories of unlawful police practices presents elevated risks that dirty data will lead to flawed or unlawful predictions, which in turn risk perpetuating additional harm via feedback loops throughout the criminal justice system. The use of predictive policing must be treated with high levels of caution and mechanisms for the public to know, assess, and reject such systems are imperative.
The tech giant records people’s locations worldwide. Now, investigators are using it to find suspects and witnesses near crimes, running the risk of snaring the innocent.
Melissa Morales was riding her bicycle near the Flamingo Diner just off of U.S. Highway 1 in Stuart, Florida, when she was stopped by a Martin County sheriff’s deputy. It was 10 p.m. but still warm on an evening in late October 2018, and Deputy Steven O’Leary told Morales he stopped her because her bike had no lights.
Morales apologized and promised to get lights, but O’Leary decided to search her purse regardless. Inside, he found what he described as a “white, rocklike substance.” He then ran a field test that he said yielded a positive result for methamphetamine. The 37-year-old Floridian told O’Leary that what he claimed was meth was “just a rock.”
At 9:00 a.m. last December 14, a man in Orange County, California, discovered he’d been robbed. Someone had swiped his Volkswagen Golf, his MacBook Air and some headphones. The police arrived and did something that is increasingly a part of everyday crime fighting: They swabbed the crime scene for DNA.
Normally, you might think of DNA as the province solely of high-profile crimes—like murder investigations, where a single hair or drop of blood cracks a devilish case. Nope: These days, even local cops are wielding it to solve ho-hum burglaries. The police sent the swabs to the county crime lab and ran them through a beige, photocopier-size “rapid DNA” machine, a relatively inexpensive piece of equipment affordable even by smaller police forces. Within minutes, it produced a match to a local man who’d been previously convicted of identity theft and burglary. They had their suspect.