Ethics in the workplace
What I learned
-
“How AI is developed and used will have a significant impact on society for many years to come,” Pichai wrote. “These are not theoretical concepts; they are concrete standards that will actively govern our research and product development and will impact our business decisions.”
-
Google is committing to not using artificial intelligence for weapons or surveillance.
-
That employees at Google were concerned enough to protest against AI being used in weaponry shows how much technology has evolved and how seriously workers take its consequences.
-
Even if Google says it won't use AI for weaponry, that does not mean that other huge companies won't.
-
Google will have to offer more public transparency about the systems it builds.
Ethics in Technology
What I learned
-
Self-driving car ethics is about whom the car's AI chooses to hit, and possibly kill, when forced into very difficult situations.
-
The ethics of self-driving cars has been controversial for several years, if not a decade.
-
There is evidence that people are worried about the choices self-driving cars will be programmed to make.
-
Automakers and suppliers largely downplay the risks of what in philosophical circles is known as “the trolley problem” — named for a no-win hypothetical situation in which, in the original format, a person witnessing a runaway trolley could allow it to hit several people or, by pulling a lever, divert it, killing someone else.
-
While some people in the industry, like Tesla’s Elon Musk, believe fully autonomous vehicles could be on U.S. roads within a few years, others say it could be a decade or more — and even longer before the full promise of self-driving cars and trucks is realized.