Wednesday, June 28, 2017

Robots Replacing Lawyers? Chatbots.

I remember way back, when I was really cool. Oh wait.  That wasn't me.

Thinking I may have liked it all better before I could do so much of my work on the internet.

Donald Trump CHOOSES the color he is. It's not a natural color.  THAT'S how rich he is. Can it be long before we all are orange?... to stay in fashion?


50th reunion Silsbee 1962 class



Despite exposés and embarrassments, hundreds of judges preside in New York without law degrees

The news releases are sent out with considerable regularity, brief and basic accounts of actions taken by the New York State Commission on Judicial Conduct: A judge is sanctioned for misconduct on the bench; another agrees to give up the job because of questionable behavior in his or her private life.
Many of the announcements note that the judges, as part of their agreement with the commission, pledge to never seek or accept a job as a judge again. And some of the announcements include a fact that still packs a 21st century punch of surprise: The judges being disciplined are not, and never have been, lawyers.
Take, for instance, the announcement the commission issued today, June 26: “The New York State Commission on Judicial Conduct announced that Gary M. Poole, a Justice of the Rose Town Court, Wayne County, will resign from office effective July 1, 2017, and has agreed never to seek or accept judicial office at any time in the future.”
Poole consented to resign after the commission began investigating claims that he engaged in “repeated, undignified and discourteous conduct toward a woman with whom he had been involved romantically.”
Poole agreed to accept the commission’s action and signed a stipulation laying out the charges and results. He also waived any confidentiality protections and signed the stipulation knowing it would be made public.
“Among other things,” the commission’s announcement read, “the judge was alleged to have yelled demeaning and derogatory things about her and her new boyfriend in public, spuriously threatened her with prosecution, demanded the return of certain personal property and threatened to encourage her ex-husband to commence a custody battle over her children.”
And then the final line: “Judge Poole, who is not an attorney, has served as a Justice of the Rose Town Court since 1993.”
That some judges in New York state are not required to be lawyers, or to have any formal legal training, has been a little-understood fact for much of the last century. It has, on occasion, drawn some notice. In 2006, The New York Times published a broad and damning series on the work of what are known as town and village justices, some 2,000 or so of whom hold court in the state. It made for remarkable reading:
“Some of the courtrooms are not even courtrooms: tiny offices or basement rooms without a judge’s bench or jury box. Sometimes the public is not admitted, witnesses are not sworn to tell the truth, and there is no word-for-word record of the proceedings.
Nearly three-quarters of the judges are not lawyers, and many — truck drivers, sewer workers or laborers — have scant grasp of the most basic legal principles. Some never got through high school, and at least one went no further than grade school.
But serious things happen in these little rooms all over New York State. People have been sent to jail without a guilty plea or a trial, or tossed from their homes without a proper proceeding. In violation of the law, defendants have been refused lawyers, or sentenced to weeks in jail because they cannot pay a fine. Frightened women have been denied protection from abuse.
The examination found overwhelming evidence that decade after decade and up to this day, people have often been denied fundamental legal rights. Defendants have been jailed illegally. Others have been subjected to racial and sexual bigotry so explicit it seems to come from some other place and time. People have been denied the right to a trial, an impartial judge and the presumption of innocence …
The reporting by the Times provoked real and promised reforms. But what many felt was the core problem – not requiring the justices to be lawyers – remained unchanged.
And so the announcements still come from the judicial conduct committee.
June 22, 2017: A justice of the Rossie Town Court in St. Lawrence County resigned and agreed never to seek or accept judicial office at any time in the future after being accused of mishandling court funds and failing to file the dispositions of hundreds of cases over half a dozen years. The justice was not a lawyer.
June 21, 2017: A justice of the Spring Valley Village Court in Rockland County resigned from office, according to the commission, because his felony record disqualified him from being a judge. Despite a 1978 felony conviction, the judge had been appointed to fill a vacancy on the village bench after the prior judge had been removed from office by the commission.
“Judge Michel was ineligible to serve as a village justice in the first place because he is a convicted felon,” the commission said. “Under the circumstances, his departure from office was inevitable, and his agreement to do so sooner rather than be forced into it later was responsible.”
May 18, 2017: A justice in a town court in Broome County was ordered by the commission to be removed from office for trying to get his daughter’s traffic ticket fixed and improperly trying to influence the judge who was handling appeals of the justice’s decisions. The judge was not a lawyer.
The list goes on – a justice removed for drunk driving; another for physically abusing a colleague; another who, while not a lawyer himself, had nonetheless intervened in a friend’s case in another court by appearing as the friend’s lawyer.
The commission, created in 1978, has responsibility for some 3,400 judges at all levels statewide. It handles close to 2,000 complaints a year, and, of course, any number of them can involve judges who are lawyers and who are handling cases in the state’s more professional courts.
Just last week, in fact, the commission announced the retirement of a judge working in state Supreme Court. He agreed never to seek or accept judicial office at any time in the future after it was revealed that he’d been accepting his six-figure salary despite never reporting to work for several years because of a health issue.
But the majority of the cases resulting in action involve the town and village judges. Marisa Harrison, the public records officer at the commission, said 70 percent of the cases resulting in discipline over the course of the commission’s existence dealt with such judges.
In 2006, the Times listed the explanations for the enduring existence of untrained judges in New York State:
“The powerful idea that communities should choose their own destinies, including their own judges. The considerable costs of updating courtrooms and hiring lawyers to preside. The always-popular calls to keep lawyers out of people’s lives. And, not least, the power of the justices, who are often important players in local politics, wired into the same party mechanisms that produce the state’s lawmakers, judges and governors.”
In an email exchange, Robert Tembeckjian, the commission’s administrator, said “the commission has advocated some reforms regarding the town and village courts.
“Two in particular that have come to pass are the recording of all proceedings — the court system has supplied every town and village court with laptops that have audio capability, and a rule of the Chief Administrative Judge requires all proceedings to be recorded and maintained, and more extensive ethics training for judges, which has resulted in the State Magistrates Association having commission staff make ethics presentations at all of their annual meetings and many of their regional meetings.”
Tembeckjian said the commission has not taken a position on whether to require all town and village justices to be lawyers, or whether to replace the current system with a full-time regional alternative such as a district court.
“One recent recommendation we have made that has not been implemented yet: a formal training and education program for town and village court clerks that would include accounting training, since maintenance of court fines and records is an important fiduciary responsibility of the town and village courts and the judges that is often delegated to the court clerks,” Tembeckjian said.
ProPublica is a Pulitzer Prize-winning investigative newsroom.


Fifth Circuit Reverses $2 million Brady Violation Case Against Police Because Plaintiff Pleaded Guilty

The U.S. Fifth Circuit Court of Appeals has overturned a $2 million civil rights award to an "actually innocent" plaintiff.







Who’s to blame when self-driving cars crash?

On May 7, 2016, Joshua Brown made history. The Canton, Ohio, resident became the first person to die in a self-driving car.
Brown, 40, had turned on Autopilot, the autonomous driving system of his Tesla Model S, and set the cruise control at 74 miles per hour. As his car raced down a highway west of Williston, Florida, a tractor-trailer came out of an intersecting road.
Tesla’s Autopilot is a technological marvel. It controls the car, using radar and cameras to scan the road. It keeps the car within lanes on highways. It brakes, accelerates and passes other vehicles automatically.
According to one of Tesla’s public statements, the camera on Brown’s car failed to recognize the tractor-trailer crossing the highway against a bright sky. As a result, the car did not brake, nor did it issue any warning to Brown. The car crashed into the trailer, killing Brown.
The automobile’s self-driving system was not at fault, according to an investigation conducted by the National Highway Traffic Safety Administration. The agency found that Autopilot was designed to prevent Tesla cars from rear-ending other vehicles but was not intended to handle situations when vehicles crossed the road from intersecting roadways. Thus, there were no “defects in the design or performance” of the system, the NHTSA concluded.
Brown was responsible for the crash, according to the agency. If he had been paying attention, he would have seen the truck crossing the highway and would have had at least seven seconds to respond—sufficient time to avoid the collision.
No one knows for sure what Brown was doing in the last seconds of his life. But the other driver told police he heard a Harry Potter movie playing in the crushed automobile after the crash.
Tesla avoided liability for his death because Autopilot was intended to aid, not replace, human drivers. The technology, however, is changing. Google, Mercedes-Benz, Tesla, Uber and Volvo are some of the companies working to develop fully autonomous cars, intended to drive themselves without human intervention. Google’s prototypes don’t have steering wheels or brake pedals.


Matthew T. Henshon, a partner at Henshon Klein in Boston and chair of the Artificial Intelligence and Robotics Committee of the ABA Section of Science and Technology Law, says “people haven’t really thought ... through” who—or what—will be liable when fully autonomous cars crash, resulting in injury or death.
“This is going to burgeon into the most significant subject matter of the 21st century,” says Paul F. Rafferty, a partner in the Irvine, California, office of Jones Day.
The law, as it stands now, is simple. Human beings cannot delegate driving responsibility to their cars. In self-driving cars, a human must be ready to override the system and take control.
This rule has to be updated, according to the NHTSA’s September 2016 report on autonomous vehicles. The organization suggested that different legal standards should apply, “based on whether the human operator or the automated system is primarily responsible for monitoring the driving environment.” For the latter type of vehicles—dubbed “highly automated vehicles”—the HAV system should be deemed the driver of the vehicle for purposes of state traffic laws, the NHTSA recommended. In other words, the HAV, not its passengers, should be criminally liable when traffic laws are violated.
There are good policy reasons for this, says Jeff Rabkin, a former prosecutor and now a partner in the San Francisco office of Jones Day. “If a passenger has no way to operate the vehicle, prosecuting the passenger would not serve any of the purposes of criminal law,” Rabkin says.
Similarly, it wouldn’t make a lot of sense to impose civil liability on the human occupants when the HAV has an accident. The NHTSA therefore has encouraged states to revise their tort laws to hold HAVs liable when there are crashes.


Holding an HAV accountable is easier said than done. “Multiple defendants are possible: the company that wrote the car’s software; the businesses that made the car’s components; the automobile manufacturer; maybe the fleet owner, if the car is part of a commercial service, such as Uber,” says Gary E. Marchant, director of the Center for Law, Science and Innovation at the Sandra Day O’Connor College of Law at Arizona State University. “How would you prove which aspect of the HAV failed and who should be the liable party?”
Technical forensic investigations will be required. “Attorneys will need to hire experts to download the black boxes from the vehicles and evaluate the precise system failure that caused the accident—a time-consuming process that will surely add additional expense to litigation,” says Jeffrey D. Wolf, a partner at Heimanson & Wolf in Los Angeles.
This will complicate criminal prosecutions of HAV companies—and transform civil accident cases. Relatively simple negligence suits that involve two parties will be replaced with complex, lengthy and expensive product liability litigation with multiple defendants.
“What have been, to date, mostly straightforward cases of fault against an owner for improper handling of a car will now become cases that are much more expensive,” Wolf says.
As a result, many tort victims will be unable to obtain justice. “It will be difficult to accommodate driverless vehicles under the current common-law framework. We will need a new statutory scheme because otherwise it will be too costly for individuals to prosecute [tort] claims,” says Wayne R. Cohen, founder and managing partner at Cohen & Cohen in Washington, D.C.
He favors a strict liability regime that covers HAV-makers and subcontractors. “Otherwise, you will impede access to the civil justice system for anyone who is injured,” Cohen says.
Other experts worry that a strict liability regime would put an unfair burden on manufacturers of HAVs. “There will be far fewer accidents with HAVs, but when they occur the vehicle’s manufacturer will be sued. So carmakers will have more liability than they do now for making a safer product,” Marchant says.


A strict liability regime could discourage companies from making HAVs. But public policy should encourage manufacturers of HAVs because studies have repeatedly concluded they’re far safer than human-driven cars.
A 2013 study by the Eno Center for Transportation (a nonprofit think tank in Washington, D.C.) estimated that if 10 percent of the cars on U.S. roads were HAVs, 1,100 lives would be saved annually. If 90 percent of the cars were HAVs, 21,700 lives would be saved each year.
HAVs are expected to provide other benefits to society: easing traffic congestion; shortening travel time; burning less fuel; lowering emissions; and providing mobility to those who cannot drive, such as seniors and people with vision problems. If HAVs constituted 90 percent of cars on U.S. roads, the nation would save more than $355 billion per year, the Eno Center estimated.
Because a negligence standard might make it too expensive for crash victims to obtain justice and a strict liability standard might discourage companies from putting HAVs on the road, some people are contemplating other, less traditional methods for handling HAV tort liability. “Perhaps the creation of a no-fault system would be best, funded by buyers of autonomous vehicles or by a percentage of state motor vehicle fees,” Rafferty says.
A similar no-fault system was created to protect another socially beneficial product. “Vaccines made people safer, but there was great liability when something went wrong, so we had to change the liability regime,” Marchant says.
In 1986, Congress required vaccine-makers—in exchange for legal protections—to contribute to a no-fault compensation fund. Sufferers go before special vaccine courts that can’t award punitive damages, only compensation. “They may need to set up something like that for driverless cars,” Marchant says.
Some experts warn it’d be premature to enact laws now. “Legislation changes slowly, and technology changes fast. Legislation can become obsolete very quickly,” Rabkin says.


Police Officers in Helicopter Attack Venezuela's Supreme Court

The Venezuelan authorities inspected the area around the Supreme Court in Caracas on Tuesday after a police faction flew a helicopter over the court headquarters, dropping grenades and firing into the building, according to government officials.


The attack, with grenades, was a rare act of disloyalty against a government under pressure from protests and an economic collapse.


Chatbot apps help users communicate their legal needs

When I was at the ABA Techshow this year, a prevalent question on people’s minds seemed to be: “Are the robot lawyers coming?”
In some ways, they’re already here.
Take chatbots. For most people, chatbots are what you see when you go to a website and an automated program starts asking whether you wish to chat. Using natural language, chatbots can simulate human conversation, giving the user the impression that they are talking with an actual person instead of with artificial intelligence. Chatbots are already being used in a variety of ways, including addressing customer needs, educating children, providing investment advice and even debating the meaning of life.
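At their simplest, chatbots of the kind described above match a user's words against known intents and reply with scripted text. The sketch below is a minimal, invented illustration of that pattern; real chatbots typically use natural-language-processing libraries rather than bare keyword matching, and none of these intents or replies come from any actual product.

```python
# Minimal rule-based chatbot sketch (illustrative only).
# Intents and canned replies are hypothetical, not from any real service.
RESPONSES = {
    "ticket": "It sounds like you have a traffic-ticket question. When was the ticket issued?",
    "asylum": "I can help you start an asylum application. Which country are you in right now?",
    "hello": "Hi! Tell me briefly what legal problem you need help with.",
}

def reply(user_message: str) -> str:
    """Return the canned response for the first keyword found in the message."""
    text = user_message.lower()
    for keyword, response in RESPONSES.items():
        if keyword in text:
            return response
    # Fall back when no intent matches.
    return "Sorry, I didn't understand. Could you rephrase that?"

print(reply("Hello there"))
print(reply("I got a parking ticket yesterday"))
```

Even this toy version shows why the approach scales: adding a new capability is just adding a new intent and response, which is how commercial bots grow from customer service into legal intake.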
And chatbots can help people with their legal needs. During the Tech for Justice Hackathon plus Veterans event that took place in March during Techshow, first place went to Carry On, a tool designed to help victims of military sexual trauma. An important part of Carry On is Coralie, a chatbot that connects users to resources or services to allow them to report an incident, navigate the justice system or get help.
“If I need help right now and a human isn’t available, that’s a problem, given the world we live in today, as well as our technological capabilities,” said Christy Leos, director of operations, who helped design the chatbot portion of Carry On.


In 2016, Joshua Browder became an instant sensation when his legal chatbot, DoNotPay, overturned nearly 160,000 parking tickets on behalf of users in the United Kingdom and the United States.
A computer science student at Stanford University, Browder designed the bot because he is, in his words, a terrible driver. “I got a bunch of tickets, and when I went to appeal them I found that I was copying the same text over and over,” says Browder, who claims that DoNotPay had successfully overturned 245,000 tickets in the U.K. and U.S. as of March.
That month, Browder announced he had expanded his chatbot’s capabilities to help refugees in Canada, the U.S. and the U.K. claim asylum. The plight of refugees hit home for Browder, whose grandmother fled the Nazis; Brexit and President Donald Trump’s policies have underscored the need for such a tool.
Whereas he had only needed a couple of months during his summer vacation to set up the parking ticket bot, the immigration bot was more complex and took nearly a year to build. Browder says he consulted with immigration lawyers in all three countries, and the chatbot asks users a few questions and then auto-populates the necessary forms for them.
“The benefit of a chatbot comes from the fact that many people are really terrible at describing their legal problems,” says Browder. “There are lots of ways to do it but only one legal way. The chatbot can translate human input into legally correct input.”
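The intake pattern Browder describes, asking a fixed set of questions and auto-populating a form from the answers, can be sketched as follows. The questions and the appeal template here are invented for illustration and do not reflect DoNotPay's actual forms or wording.

```python
# Hypothetical question-and-answer intake that fills a form template.
# Questions and template text are invented, not DoNotPay's real content.
QUESTIONS = [
    ("name", "What is your full name?"),
    ("date", "On what date did you receive the ticket?"),
    ("reason", "In one sentence, why do you believe the ticket is unfair?"),
]

APPEAL_TEMPLATE = (
    "To whom it may concern:\n"
    "I, {name}, received a ticket on {date}. "
    "I am appealing because {reason}.\n"
)

def run_intake(answer_fn) -> str:
    """Ask each question via answer_fn and return the completed appeal form."""
    answers = {key: answer_fn(prompt) for key, prompt in QUESTIONS}
    return APPEAL_TEMPLATE.format(**answers)

# In a real chatbot, answer_fn would be the chat interface; here a dict
# of canned replies stands in for the user.
canned = {
    "What is your full name?": "Jane Doe",
    "On what date did you receive the ticket?": "June 1, 2017",
    "In one sentence, why do you believe the ticket is unfair?": "the signage was obscured",
}
print(run_intake(canned.get))
```

The key design point is the one Browder makes: the free-form answer is captured once and then slotted into legally correct boilerplate, so the user never has to phrase anything "the legal way" themselves.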
The new version of DoNotPay is on Facebook Messenger. Browder decided to use the social network’s chat interface because of its reach (more than 1 billion users) and accessibility.


Facebook has become one of the go-to places for chatbots, boasting more than 30,000 as of September 2016.
California lawyer Tom Martin, who created LawDroid for Facebook Messenger, points out that Apple hasn’t really rolled out the welcome mat for chatbot apps yet.
“Facebook provides detailed demographic information in terms of who is using [LawDroid],” says Martin, who launched the app in November to help individuals generate California business incorporation documents. “I can then tailor my marketing efforts based on that information.”
The technology isn’t perfect. Some chatbots have become vehicles for scammers. Last July, for instance, Tinder users were swindled by a chatbot phishing for personal information by impersonating potential matches. Others, including Joshua Lenon, lawyer-in-residence at Clio, have argued that chatbots can limit access to justice by telling users that they have no case or recourse.
Nevertheless, Browder and Martin think the sky is the limit. Martin points to Amazon’s Alexa and thinks chatbots with voice capabilities could eventually perform a wide range of legal services. Browder, meanwhile, thinks that chatbots could help facilitate cooperation between the law and other disciplines, particularly medicine.
“Lawyers are confined by their law degrees,” Browder says. “But a chatbot doesn’t have to stop between industries. They can diagnose medical or psychological illnesses and also help users get legal help.”

3 Chicago Officers Charged With Conspiracy in Laquan McDonald Case

Jason Van Dyke, in court last month, is charged with murder in the shooting of Laquan McDonald. Three fellow officers who backed up his account of the episode are accused of trying to protect him from being investigated and charged.


They are accused of covering up for Jason Van Dyke, a white officer who fatally shot Mr. McDonald, a black teenager, in 2014.

