In a unanimous decision released Friday afternoon, the state Supreme Court threw out the 1989 murder convictions of two New Milford men and delivered a stinging rebuke to renowned forensic science expert Henry Lee, whose inaccurate testimony put them in prison for decades.
Both Sean Henning and Ralph Birch were convicted in separate trials of the bloody murder of Everett Carr, who was stabbed 47 times and whose throat was slit; his blood was tracked through the house. They were convicted partly on the testimony of Lee, who told jurors that a towel in the bathroom of Carr’s home had a spot on it that he had tested and found was “consistent with blood.”
Tuesday, July 16, 2019, 12:00 PM – 1:00 PM
Technology & Criminal Law Committee Minor CLE
The Technology Committee & The Criminal Law Committee present:
Digital Forensics Brief For Attorneys
Description: Most attorneys are unaware of how digital artifacts can play a critical role in their cases; digital forensics goes much deeper than litigation support and electronic discovery. This one-hour talk will give attendees a better understanding of how digital artifacts are used to tell a story in litigation and criminal cases.
Instructors of this course are seasoned digital forensics experts who have testified in state and federal courts and worked hundreds of cases. Upon completion, attendees will be prepared to contact a digital forensics expert, know the answers to some of the most commonly asked questions, and have an expanded perspective for how digital analysis can affect their cases.
Location: OCBA Center – 880 N. Orange Ave
Program: 12:00pm – 1:00pm
CLE: 1.0 General & 1.0 Technology CLE Credit
- Aaron Weiss
- Santiago Ayala
Sponsored by: Hex 21 Group
Registration deadline: July 14, 2019
To register, visit the OCBA Store.
If you have any questions or need help registering, please contact Ashley Norris at firstname.lastname@example.org or 407-422-4551 ext. 233.
***If you are an OCBA Member, please make sure you log in to your account before registering for any events/seminars to receive your member discount, if applicable***
A government watchdog says the FBI has access to about 640 million photographs — including from driver’s licenses, passports and mugshots — that can be searched using facial recognition technology.
The figure reflects how the technology is becoming an increasingly powerful law enforcement tool, but is also stirring fears about the potential for authorities to intrude on the lives of Americans. It was reported by the Government Accountability Office at a congressional hearing in which both Democrats and Republicans raised questions about the use of the technology.
Since April 2018, when police announced they had apprehended Joseph DeAngelo, the man they alleged to be the long-elusive Golden State Killer, the floodgates have opened.
The key insight responsible for DeAngelo’s arrest came courtesy of a then-little-known forensic technique known as genetic genealogy: a method in which investigators try to link crime scene DNA to DNA from biological relatives in the hopes of generating leads for identifying suspects or remains. The science behind the technique has been around for a while. Yet the real potential to get hits in these searches has only been made possible by the recent advent of online, easily accessible DNA databases like GEDmatch (where police got a match for a distant relative of DeAngelo’s) and FamilyTreeDNA—sites that now boast more than 1 million user profiles each. Many of these come from individuals who uploaded their own genetic data from popular consumer DNA testing kits like 23andMe and AncestryDNA.
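The matching step at the heart of genetic genealogy can be illustrated with a deliberately simplified sketch. Real services such as GEDmatch compare long shared DNA segments measured in centimorgans; the toy version below (all marker positions, genotypes, and user names are hypothetical) just scores how often two genotype profiles agree, with a higher score suggesting a closer relative.

```python
def shared_fraction(profile_a, profile_b):
    """Fraction of markers at which two genotype profiles agree."""
    matches = sum(a == b for a, b in zip(profile_a, profile_b))
    return matches / len(profile_a)

# Genotypes at eight hypothetical marker positions.
crime_scene = ["AG", "CC", "TT", "AG", "GG", "CT", "AA", "CC"]
database = {
    "user_1": ["AG", "CC", "TT", "AG", "GG", "CT", "AA", "CC"],
    "user_2": ["AA", "CT", "TT", "GG", "GG", "CC", "AA", "CT"],
}

# Rank database profiles by similarity to the crime-scene sample.
for name, profile in database.items():
    print(name, shared_fraction(crime_scene, profile))
```

In practice the databases hold hundreds of thousands of markers per profile, and a partial match to a distant relative (as in the DeAngelo case) is enough to narrow a family tree down to a handful of candidate suspects.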
A Democratic senator sent a letter to Amazon CEO Jeff Bezos Thursday requesting information about why the company retains transcripts of conversations recorded by Amazon Echo devices, even after users have pressed “delete.”
Amazon’s voice-controlled assistant Alexa transcribes the conversations it picks up after users say a “wake word” — “Alexa,” “Echo,” “Amazon” or “computer” — or press a button to enable the Echo, according to a report by CNET. And the company saves those text files on its servers even after users opt to “delete” the audio files from the cloud, a CNET investigation revealed.
A Central Florida company has been awarded a $225,000 grant to develop technology to take the human factor out of the field tests law-enforcement officers use to identify illegal drugs and make arrests.
IDEM LLC, a client of the UCF Business Incubation Program, received the grant from the National Science Foundation.
A sophisticated voice-identity technology that monitors inmates on prison telephones has been installed in at least 23 Florida counties and has been used to bring criminal charges against inmates in at least one of them, a Fresh Take Florida investigation found.
The technology produced by a secretive, Dallas-based company is designed to make and store voice prints of inmates and to ensure they are using prison phones under their own identities, rather than secretly making calls using the IDs of other inmates.
San Francisco, long one of the most tech-friendly and tech-savvy cities in the world, is now the first in the United States to prohibit its government from using facial-recognition technology. The ban is part of a broader anti-surveillance ordinance that the city’s Board of Supervisors approved on Tuesday. The ordinance, which outlaws the use of facial-recognition technology by police and other government departments, could also spur other local governments to take similar action. Eight of the board’s 11 supervisors voted in favor of it; one voted against it, and two who support it were absent.
Law enforcement agencies are increasingly using predictive policing systems to forecast criminal activity and allocate police resources. Yet in numerous jurisdictions, these systems are built on data produced during documented periods of flawed, racially biased, and sometimes unlawful practices and policies (“dirty policing”). These policing practices and policies shape the environment and the methodology by which data is created, which raises the risk of creating inaccurate, skewed, or systemically biased data (“dirty data”). If predictive policing systems are informed by such data, they cannot escape the legacies of the unlawful or biased policing practices that they are built on. Nor do current claims by predictive policing vendors provide sufficient assurances that their systems adequately mitigate or segregate this data. In our research, we analyze thirteen jurisdictions that have used or developed predictive policing tools while under government commission investigations or federal court monitored settlements, consent decrees, or memoranda of agreement stemming from corrupt, racially biased, or otherwise illegal policing practices. In particular, we examine the link between unlawful and biased police practices and the data available to train or implement these systems. We highlight three case studies: (1) Chicago, an example of where dirty data was ingested directly into the city’s predictive system; (2) New Orleans, an example where the extensive evidence of dirty policing practices and recent litigation suggests an extremely high risk that dirty data was or could be used in predictive policing; and (3) Maricopa County, where despite extensive evidence of dirty policing practices, a lack of public transparency about the details of various predictive policing systems restricts a proper assessment of the risks. The implications of these findings have widespread ramifications for predictive policing writ large. 
Deploying predictive policing systems in jurisdictions with extensive histories of unlawful police practices presents elevated risks that dirty data will lead to flawed or unlawful predictions, which in turn risk perpetuating additional harm via feedback loops throughout the criminal justice system. The use of predictive policing must be treated with high levels of caution and mechanisms for the public to know, assess, and reject such systems are imperative.
The tech giant records people’s locations worldwide. Now, investigators are using it to find suspects and witnesses near crimes, running the risk of snaring the innocent.
Melissa Morales was riding her bicycle near the Flamingo Diner just off of U.S. Highway 1 in Stuart, Florida, when she was stopped by a Martin County sheriff’s deputy. It was 10 p.m. but still warm on an evening in late October 2018, and Deputy Steven O’Leary told Morales he stopped her because her bike had no lights.
Morales apologized and promised to get lights, but O’Leary decided to search her purse regardless. Inside, he found what he described as a “white, rocklike substance.” He then ran a field test that he said yielded a positive result for methamphetamine. The 37-year-old Floridian told O’Leary that what he claimed was meth was “just a rock.”
At 9:00 a.m. last December 14, a man in Orange County, California, discovered he’d been robbed. Someone had swiped his Volkswagen Golf, his MacBook Air and some headphones. The police arrived and did something that is increasingly a part of everyday crime fighting: They swabbed the crime scene for DNA.
Normally, you might think of DNA as the province solely of high-profile crimes—like murder investigations, where a single hair or drop of blood cracks a devilish case. Nope: These days, even local cops are wielding it to solve ho-hum burglaries. The police sent the swabs to the county crime lab and ran them through a beige, photocopier-size “rapid DNA” machine, a relatively inexpensive piece of equipment affordable even by smaller police forces. Within minutes, it produced a match to a local man who’d been previously convicted of identity theft and burglary. They had their suspect.
Police found 19 spent shell casings scattered in the San Diego street where Gregory Benton was murdered on April 12, 2014. Benton and his cousin had gone to buy cigarettes, a witness later said. As they returned to a family party, two men pulled up in a car behind them. They got out, and at least one of them opened fire.
Witnesses didn’t get a good look at the men or the car, so when police sat down to review their leads, the shell casings were the best evidence they had. They sent the casings to the San Diego Police Crime Lab, which just happened to be trying out a new DNA testing technique.
One night in November 1999, a 26-year-old woman was raped in a parking lot in Grand Rapids, Mich. Police officers managed to get the perpetrator’s DNA from a semen sample, but it matched no one in their databases.
Detectives found no fingerprints at the scene and located no witnesses. The woman, who had been attacked from behind, could not offer a description. It looked like the rapist would never be found.
The Orlando Police Department is seeking state funding to buy a new DNA-testing technology that would allow officers to test and compare evidence in less than two hours, without shipping it to a state lab.
In its pitch for $250,000 to Orange County’s legislative delegation, the agency said the technology, known as Rapid DNA, “has the potential to change the paradigm for law enforcement and its capacity to solve cases,” including by identifying suspects or linking crime scenes.
Court records and FBI Lab files show that statements by prosecutors or by Richard Vorder Bruegge, the most prominent member of the Forensic Audio, Video and Image Analysis Unit, veered from his original conclusions in at least three cases.
March 5, 2019 @ 2:00 p.m. Eastern, 60 minutes
This session will focus on helping your client navigate the landscape of DNA dystopia. Learn about DNA collection of minors; expansion of rogue databases; spread of Rapid DNA; the mining of genealogical databases by law enforcement; and the linking of forensic databases with genealogical ones. Attend this session and learn how to fight these threats to your clients’ privacy rights in the age of genetic surveillance.
On the night of March 16, 2017, the city of Raleigh, North Carolina, suffered its biggest fire in a century. The flames scorched 10 buildings, including churches and businesses. A seven-story apartment complex, then under construction, was reduced to ashes. The fire ultimately caused $50 million in damages.
Over the next year, authorities investigated the fire but seemed to struggle to determine its cause. According to a report by local NBC affiliate WRAL, the Raleigh police went to extreme lengths to find out if an arsonist may have set the blaze. Investigators served a search warrant to Google, asking the company to provide the coordinates of any phones that were in the area between 7:30 p.m. and 10:30 p.m. on the night of the fire. It was likely for naught—police ended up classifying the cause of the fire as “undetermined.”
The use of PredPol—a predictive policing software that once advocated for a controversial, unproven “broken windows” approach to law enforcement—is far more widespread than previously reported, according to documents obtained by Motherboard using public records requests.
PredPol claims to use an algorithm to predict crime in specific 500-foot by 500-foot sections of a city, so that police can patrol or surveil specific areas more heavily.
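Grid-based systems of this kind generally start by binning past incident locations into fixed cells and ranking cells by recent counts. Here is a minimal, hypothetical sketch of that binning step (not PredPol's actual proprietary algorithm, and the incident coordinates are invented), using 500-foot cells as described above.

```python
from collections import Counter

CELL_FT = 500  # cell size in feet

def cell_of(x_ft, y_ft):
    """Map a coordinate (in feet from a fixed origin) to its grid cell."""
    return (int(x_ft // CELL_FT), int(y_ft // CELL_FT))

# Hypothetical incident coordinates: feet east/north of an origin point.
incidents = [(120, 80), (450, 300), (510, 90), (530, 140), (1200, 900)]

counts = Counter(cell_of(x, y) for x, y in incidents)

# Cells with the most incidents would be flagged for heavier patrol.
hotspots = counts.most_common(2)
print(hotspots)
```

Real systems layer a statistical model on top of these counts (PredPol has described using models borrowed from earthquake aftershock prediction), but the cell-level counts are the raw input, which is why the dirty-data concerns above apply directly.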
Roughly six months ago at New York’s Sing Sing prison, John Dukes says he was brought out with cellmates to meet a corrections counselor. He recalls her giving him a paper with some phrases and offering him a strange choice: He could go up to the phone and utter the phrases that an automated voice would ask him to read, or he could choose not to and lose his phone access altogether.
Dukes did not know why he was being asked to make this decision, but he felt troubled as he heard other men ahead of him speaking into the phone and repeating certain phrases from the sheets the counselors had given them.