Thursday, November 25, 2010

Computer Ethics = Important

Ethics is a very complex branch of philosophy that raises many difficult questions. Most moral problems are never completely black or white, which creates confusion about what the right thing to do is. People face moral dilemmas every day, usually with little consequence to their actions. Professional ethics carries higher stakes and can have serious impacts on a large number of people. Professionals have the responsibility of being extremely careful, honest, and knowledgeable about their area of expertise. Without proper professional ethics, people can be seriously injured or killed and businesses can lose vast amounts of money. The computer industry has many ethical guidelines that those working in it should follow, but like everything else, many moral decisions are grey areas with both negative and positive consequences. For example, if you are a programmer working for a company that writes software for medical devices and you think there is an error in the code, should you notify your boss? The answer might seem like an obvious YES, but what if everyone else thinks the code is fine and your complaint pushes back the release of software that could have saved many lives if it was released on time? The point is that it is sometimes difficult to do what is morally right, even with good intentions. Computer professionals have an ethical obligation to make products that do what they are meant to do and are safe (e.g., the software should be accurate enough that it doesn't cause a bridge to collapse). Computer professionals should also be open and honest about what the software is capable of and stay in contact with clients/users to make sure they know how to use the program.


source: http://www.enduringamerica.com/storage/blog-post-images/THE%20THINKER.jpg?__SQUARESPACE_CACHEVERSION=1288076293112
Cold Hard Cash vs. Doing The Right Thing... Tough Call

All industries want to cut costs and maximize profit, and of course companies that rely on computer science are no different. The gaming industry is infamous for rushing games out to buyers before enough testing has been done to make sure the game runs smoothly. It's impossible to guarantee that video games will never encounter glitches while being played, and luckily developers frequently release free patches to fix common bugs. Some developers will work with gamers to help fix bugs by offering beta versions of their games and collecting feedback. Glitches in video games can be annoying, but at least this problem is not dangerous (except when someone gets enraged and puts their fist through their TV). Faulty code in other software (e.g., operating systems and websites), however, can lead to serious problems. Without proper testing of websites and other software there is the risk of users having their personal information stolen or losing important data. The higher risk places a greater moral obligation on the company developing the software to make sure it is as safe as possible, even if it means losing money by delaying the release.



Writing faulty code is a very common problem amongst programmers. Although writing code that functions perfectly every time isn't realistic, proper testing of software should be a top priority. Testing is especially important where the software is used in medical equipment and other applications where people's well-being is at stake. The injuries and deaths caused by the Therac-25 could have been prevented if more responsibility had been taken to ensure that the code would function safely. Obviously nobody intended to harm the patients using this equipment, but it was still the moral obligation and responsibility of the company in charge to make sure the machine functioned safely.
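To make that concrete, here is a minimal sketch of the kind of test that should gate a release. This is not real medical software; compute_dose and MAX_SAFE_DOSE are invented names and the numbers are made up. The idea is simply that the code refuses obviously unsafe outputs, and a unit test proves it.

```python
# A minimal sketch, not real medical software: compute_dose and
# MAX_SAFE_DOSE are hypothetical names and values for illustration.
import unittest

MAX_SAFE_DOSE = 200.0  # hypothetical ceiling for a single treatment

def compute_dose(prescribed: float, calibration: float) -> float:
    """Refuse to return a dose outside the hypothetical safe range."""
    dose = prescribed * calibration
    if not 0.0 < dose <= MAX_SAFE_DOSE:
        raise ValueError(f"computed dose {dose} is outside the safe range")
    return dose

class TestComputeDose(unittest.TestCase):
    def test_normal_dose_passes(self):
        self.assertAlmostEqual(compute_dose(100.0, 1.5), 150.0)

    def test_overdose_is_refused(self):
        with self.assertRaises(ValueError):
            compute_dose(100.0, 25.0)  # 2500.0, far beyond the ceiling

if __name__ == "__main__":
    unittest.main()
```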

Whose Fault is the Default?

Software should be designed so that everyone can use it and so that it is easily accessible. The default settings in most programs are set to what the developers think the average user wants. The problem with this approach is that it can discriminate against various groups of people (e.g., the elderly) who are often overlooked by the development team. The defaults can usually be changed through the settings options, but accessing those options can be difficult for people who can't read the font, who speak another language, who aren't very computer savvy, etc. Default privacy settings are often set to a very low level of privacy on sites such as Facebook, which outrages some people. It's obviously not easy to make everyone happy since everyone prefers different settings in most cases. Developers should have the ethical obligation to design default settings for as many people as possible (not just the majority of people using it). In cases where defaults can't accommodate everyone (e.g., languages) it should still be made easy to change the settings (i.e., not hiding options several layers deep in the settings menu).
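Here is a hedged sketch of what "safe and accessible by default" might look like in code. Every name and value below is invented for illustration: visibility is private by default, text is larger by default, and overriding a default is one obvious call rather than a hunt through nested menus.

```python
# A hedged sketch of accessible, privacy-friendly defaults; all names
# and values here are invented for illustration.
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Settings:
    profile_visibility: str = "friends-only"  # private by default, not public
    font_scale: float = 1.25                  # larger text by default
    language: str = "en"

defaults = Settings()
# Changing a default is a single, discoverable call:
my_settings = replace(defaults, language="fr", font_scale=1.5)
print(my_settings)
```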

                           

What To Do...

It is often difficult to decide what action to take in order to be an ethically responsible computer professional, but there are two phases to consider when deciding. The first phase is brainstorming, which involves listing and considering the stakeholders, the risks, the issues, the consequences, and who benefits. The second phase is analyzing possible actions: identify the responsibilities and rights of the stakeholders, consider the impact on them, use the SE and ACM codes of ethics to see what is acceptable, and arrive at the best possible conclusion. Make sure to ask for input from others if the situation is complex and there is doubt about what is ethically responsible to do.

Thursday, November 18, 2010

Hate the Coder/User, Not the Computer

The marvels of computing power have transformed the way we live and made our lives so much easier. So everything is fine and dandy in this technological age, and computers are magically saving us from illness and from having to think for ourselves... or so it would seem. Even though computers can make life simpler and allow us to do complex tasks with ease, they can also make life more dangerous on occasion. No matter how much testing software receives, it is almost impossible to fully guarantee that it will run correctly 100% of the time, especially in elaborate programs that have millions of lines of code.

Son of a Glitch

The term "bug" apparently first originated from an actual moth getting caught inside a computer and frying it in 1947. Since then the term "bug" has been used to describe an error in a system (mainly computer software). Naturally at the earlier ages of computer science bugs were less common since the programs written were much smaller (mostly due to memory limitations). As programs get larger and larger and used in many devices that interact with the real world there are bound to be bigger problems. It is estimated that there is an average of 15-50 bugs per every 1000 lines of code. In other words the 100 million lines of code that it takes to run a modern car results in at least 1.5 million bugs that could cause the car to malfunction in some way or other. Of course most cars don't randomly crash or stall out of the blue but the potential is still there. These problems will become more dangerous in the future since the trend is to make products "smarter" and more automatic which gives more power to the computer and less to the human. More lines of code will be required for the increased complexity needed by smarter devices, so naturally more lines of code equals more bugs.



Previous Problems

The Therac-25 was used to give cancer patients radiation therapy between 1985 and 1987. Unlike its predecessors (the Therac-6 and the Therac-20), it relied on software rather than hardware safety interlocks and required almost no human intervention (basically just pressing a 'continue' button). This machine injured many patients and killed three due to overdose. These deaths and injuries could have been prevented if the machines had required human intervention and had properly tested code. There were many design flaws, such as improper usage of a flag variable and error messages that were difficult to understand. The programmers aren't entirely to blame, though, since the staff at the hospitals should also have received proper training to avoid accidents. It is also almost impossible to be confident that code of this nature is safe without actually testing it in real-world situations, something that the company that made the machine (Atomic Energy of Canada Limited) should have done.
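According to the published investigations, one of those flaws was a one-byte flag that was incremented instead of simply set, so every 256th time it rolled over to zero and a safety check was silently skipped. The real code was PDP-11 assembly, so this Python snippet is only a toy illustration of the failure mode:

```python
# Toy illustration of the reported Therac-25 flag bug; the real code
# was PDP-11 assembly, so this is a sketch of the failure mode only.
import ctypes

check_needed = ctypes.c_uint8(0)  # one-byte flag, wraps past 255

def mark_buggy():
    check_needed.value += 1       # increment: every 256th call wraps to 0

def mark_fixed():
    check_needed.value = 1        # set: can never silently read as "off"

for _ in range(256):
    mark_buggy()
print(check_needed.value)         # 0: the "run the safety check" signal is lost
```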



                              Source: http://idg.bg/test/cwd/2008/7/14/21367-radiation_therapy.JPG

Besides the Therac-25 there are countless other examples of computer glitches causing injury, death, and financial loss, among other problems. A $125 million probe sent to Mars in 1998 was lost before completing its mission because one team's software used imperial units while another expected metric. WWIII could have been started in 1983 when a glitch in Soviet missile-warning software reported that the U.S. had fired five missiles. The Ariane 5 rocket, a roughly $500 million loss, was destroyed when it reused software designed for previous rockets: the older software couldn't handle numbers as big as the new rocket produced, and the resulting overflow corrupted its calculations and rendered it useless. There are many other examples, but I'll stop there to avoid boredom. The point is that small mistakes in code and common sense can have catastrophic consequences, which need to be learned from and prevented in the future.
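The Ariane-style overflow is easy to demonstrate. Below is a minimal sketch with a made-up value (this is not Ariane's actual code): a number that fits comfortably in a 64-bit float simply doesn't fit in a 16-bit signed integer, which tops out at 32,767.

```python
# A minimal sketch with a hypothetical value, not Ariane's actual code:
# forcing a 64-bit float into a 16-bit signed integer (-32768..32767).
import ctypes

horizontal_velocity = 40000.0              # hypothetical out-of-range reading
converted = ctypes.c_int16(int(horizontal_velocity))
print(converted.value)                     # -25536: garbage fed to guidance
```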

                    

Now What?

Will computer bugs ever become a thing of the past? Probably not. After the scare of the Y2K bug there is already a new threat called the year 2038 problem. Many systems store time and dates as a signed 32-bit count of seconds, which will overflow on January 19, 2038 at 03:14:08 UTC, sending the date back to 1901. Since this one is easy to see coming, it should stay a minor problem as long as systems are fixed in time. The real problems of the future are easy to foresee, since the trend of technology is toward more complexity and more power with less human control. Image recognition is one of the fastest-growing computer-based technologies, but it is also one of the most difficult challenges to do successfully. This technology is being relied on to do everything from guiding robotic airplanes for the military to flagging likely terrorists at airports. These complex algorithms are getting better at doing their jobs but at the same time are being too heavily depended on, which can easily lead to serious problems.
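You can see the 2038 boundary yourself in a few lines of Python (which works precisely because modern Python uses more than 32 bits for time):

```python
# The 2038 boundary: a signed 32-bit time_t counts seconds since
# 1970-01-01 UTC and tops out at 2**31 - 1.
from datetime import datetime, timezone

INT32_MAX = 2**31 - 1
print(datetime.fromtimestamp(INT32_MAX, tz=timezone.utc))
# 2038-01-19 03:14:07+00:00  (the last representable second)

# One second later a 32-bit counter wraps to -2**31, i.e. back before
# 1970 (negative timestamps may not convert on every platform):
print(datetime.fromtimestamp(-2**31, tz=timezone.utc))
# 1901-12-13 20:45:52+00:00
```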

                               source: http://www.pinktentacle.com/images/face_recognition.jpg



In theory, preventing these kinds of problems should be somewhat simple as long as the proper precautions are taken. It's up to the programmers and the leaders of the companies to ensure that the programs written for different systems run the way they are supposed to. It is also very important that the users of the software are well trained to handle errors that can arise in real-world situations. In the super-competitive marketplace it is very difficult for companies to spend the time and money required to properly test their products, which results in buggy software being rushed to consumers. Even the most intelligent and well-trained programmers are likely to overlook a coding flaw that will result in problems (sometimes even years later). Perhaps one way to help fix this is to give more products with computers in them the ability to download software patches online.

http://3.bp.blogspot.com/_OUmd4JXr2gk/SM8Z9lgzD0I/AAAAAAAAAC0/OIh2eIdZyxA/s320/softrit.jpg



Computers usually do more good than bad. They make mistakes all the time, but normally the software does what it is intended to do. Even without automated systems there would still be tragedies due to human error. Doctors accidentally overdosed patients and botched surgeries before the use of computers. Pilots and drivers crashed their vehicles long before any computer chips were embedded in them. Using a GPS is far less likely to get you lost than getting directions from another person (in most cases), even if it isn't always perfect. Computers can 'remember' things in perfect detail (like pictures), unlike the human mind, which can make mistakes when recalling information. Software will probably never be completely perfect, so more human control needs to be kept over potentially deadly equipment (such as vehicles and medical devices).

                        Source: http://softwarecreation.org/images/2007/computer-vs-human.jpg




Thursday, November 11, 2010

Long Live Technology!

As a computer nerd I don't understand how anyone could possibly have a problem with computers or other technology. Sure, I get that a lot of technology is harmful and dangerous, but that doesn't mean technology as a whole is a bad thing. Without advances in science and tech we wouldn't have anywhere near as many conveniences as we have today (such as TVs, video games, clothes with built-in wifi detectors, and tons of other cool stuff). Sure, those aren't necessary and they aren't useful to our survival, but technology should be credited for the benefits it gives us and its ability to help us punch Mother Nature in the face and say "we want to live longer, better, and happier". In this biased (and for good reason) post I will talk about why we need technology and why the pros outweigh the cons. So all you Luddites out there (who shouldn't be reading this anyway) will get logically destroyed.

But Why????

Why do we need technology? Well, you see, when a species is as weak and slow as ours, there needs to be something special about it for that species to survive. We humans had a mutation millions of years ago that caused the powerful jaw muscles other apes have to disappear. Those jaw muscles anchored to the skull, which limited brain growth, so humans evolved much larger brains (mainly the cerebral cortex, which is used for higher intelligence such as logic). This weak species eventually learned how to harness the power of fire and how to craft weapons from sticks and rocks, thanks to a powerful nervous system and an opposable thumb. This control over the environment allowed more humans to survive against lions and other predators (note: by "humans" I'm also referring to some species before Homo sapiens). All of a sudden, nature stopped dictating who would eat whom based on genetics alone. Technology combined with the brain was used to create new and better forms of technology (although primitive for a VERY long time). A species that was once driven to the edge of extinction was saved by its ability to adapt and to change the world around it. Today predators don't pose much of a threat anymore, but technology continues to help us live healthier and longer.




 Live Long and Prosper

After the Second World War, technology and scientific discovery made huge leaps and bounds. Advances in medical science raised life expectancy from an average of around 30 (in ancient times) to about 70 (worldwide). Diseases like polio that once killed countless people are no longer a threat thanks to widely available vaccines, and the same goes for many other deadly diseases. Handicapped people have access to wheelchairs, electric scooters, and voice-activated controls for their homes, something that would be impossible without the proper technology. With anticipated breakthroughs in stem cell research and other medical fields, people may one day have their spinal cords repaired and walk again after being paralyzed. We have every kind of vitamin supplement, and the knowledge of which vitamins we need to stay healthy, all thanks to science and technology. With even more advances we will one day be able to cure cancer, make blind people see and deaf people hear, cure more diseases, and live even longer. Technology skeptics and haters will say that all this power can be used to do just as much damage (like bioterrorism) as it does good, but so far that hasn't been the case. Even if bioterrorism gets as bad as some think it will, that would be the result of people being evil, and the tools that also save countless lives shouldn't be blamed.



Get That Devil-Box Out of Here!




Computers are the tool of the devil according to some people (actually "nut jobs" is the proper term). The computer is used for evil, it degrades society, it is used only for pornography and time wasting, etc. Anyone who really believes that has obviously never used a computer in their lives (or they did and only used it for porn or looking up Fox News). The computer has revolutionized society arguably more than any technology that has ever existed before, and its impacts are only going to get bigger. Sure, there are a lot of parts of computing that are only for entertainment and time wasting, but the advantages of computing far outweigh its "evils". Computers are used to simulate almost everything, from car crashes to medical experiments to earthquakes. Without computing power, our scientific knowledge and ability to invent or create anything would advance at a snail's pace compared to where we are today. Not only do computers help with advances, they also create what some refer to as a "superorganism" that connects the world into one system that constantly communicates and interacts. News and information can be shared in fractions of a second, as opposed to the several days it took to carry letters on horseback. There are many in the world who can't afford or don't have access to computers and the Internet, but at the rate we are going, with companies trying to narrow that divide, soon everyone will have access to the power of computing.


Whoops!


So technology is all sunshine and rainbows and high definition, right?

Not at all...

Technology has given us a lot of power and a lot of responsibility. The problem is that we aren't very responsible. The same mind that allowed us to survive our predators has now turned us into the world's deadliest predator, one that has driven many species to extinction or endangerment. The energy sources we love to use, because they are cheap and give us the power we want, are causing the planet to overheat, melting the ice caps and producing unnatural weather patterns that wreak havoc on the world. These and several other problems caused by technology haven't even reached their full impact yet, but they already show just how devastating it can be for a species to have so much power without enough judgement. The problem now is that we can't just stop using technology, and we can't just say "OK, enough, let's keep whatever tech we have so it doesn't get worse". This progress can't be stopped; it needs to keep going in order to solve the problems that were created before. It seems like a never-ending cycle of causing a problem and then needing to fix it, but mankind may one day be able to create new technologies that can save the world without negative consequences by dramatically altering the way we do things.

We Can Do This ... Chill Out

What's next for technology will be something unimaginable. The problem we have is that technology advances far faster than our minds evolve, which means we can't easily predict the problems it will cause. But if we create a superintelligence that uses A.I. algorithms to constantly make itself smarter, we could not only advance technology at a far greater rate but also keep our understanding in step with that progress. This would allow us to solve problems faster than they occur. If we had been smarter back when coal and oil first started getting used, we could have realised how damaging they would be to the environment and switched to alternative energy sooner. Besides A.I. advances there are other upcoming technologies that can better the world. Nanotechnology is one of the most promising and will help with everything from clean energy sources to better medicines.


source: http://spectrum.ieee.org/images/public_html/automaton/ieee-spectrum-technological-singularity-thumb.png
Get to the Conclusion Already...

Technology will always be a double-edged sword, as everyone always says. With technology we are able to live longer and better than we ever could without it. There are still many ways it has harmed us and the world, but eventually the benefits will further outweigh the costs. Technology is what makes our species so special (among other things, I guess), and it will continue to move our society and our world toward a better place. Besides giving us tons of cool (but useless) stuff, our knowledge of science gives us the ability to better understand who we are and what we are capable of, and allows us to do things that used to be impossible. In time to come our knowledge will increase exponentially, and so too will our technologies, which will help us solve many problems that we can't solve in present times.

My name is Jordan, and I love technology.

Thursday, November 4, 2010

THEY TOOK OUR JOBS!!!!!!!!!

                

To get money you need to have a job (or to steal, count cards at blackjack, etc.), but getting a job is not always easy, especially when companies are always looking for ways to cut expenses. People always complain and blame the loss of their job on someone or something else. Whether people blame immigrants, offshore work, machines, computers, or robots, there always seems to be some accused source of unemployment. Technology has always been a double-edged sword and always will be. Industries use technology to replace workers and to produce things more efficiently, which saves them a lot of money. So technology is a good thing, right? Not always. Workers obviously get the short end of the stick by receiving far fewer job opportunities. So with machines and computers constantly becoming more affordable and able to do more and more jobs, is any job safe? This post is all about jobs getting taken: the ones that have been taken and the ones that are going to get taken next.


No Job For You 


Ever since people with thick skulls were replaced by rocks for breaking things, new technologies have constantly become available to take people's jobs. Switchboard operators lost their jobs once better telephone technology came out, and then people who worked for the newer phone companies were laid off as cell phone usage increased. When the automobile became common in the 1920s, it reduced the high demand for trains, which cost jobs in the railway industry. The industrial revolution enabled mass production by automating a lot of factory work, which meant many workers were no longer needed. Machines continue to take jobs from people, but computers are the newer threat to the workforce. Computer technology has replaced several jobs that used to require highly trained professionals (like stock prediction and data collection). Having programs that can 'think' and perform better than human workers is a huge benefit to companies, but not so good for the previously required employees.



Age of Super A.I.



Computer programmers should be safe, right? WRONG! Software is already capable of writing other software. Code that writes code is still limited in today's world; it requires humans to write it in the first place and to monitor any unwanted results of such an algorithm. The day will come when artificial intelligence is able to reprogram itself into constantly superior algorithms, giving computers a rise to superintelligence. These hyper-intelligent systems will of course eventually be able to write any kind of code far faster and far more accurately than hundreds of humans could. You might think that humans would still be necessary to debug code, since all programs have some bugs (and these super A.I.s are programs, of course), and that might be true. But even with errors in the code, the A.I. programs should be capable of debugging themselves (or future generations of A.I.), just as the human brain has flaws yet we have neurosurgeons and psychologists who can fix those problems. Naturally we get into the sci-fi doomsday scenario where robots do everything and it's cool for a while... until the jobs are all gone... and then they start a rebellion and enslave all of us.
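Just to show the core idea isn't pure science fiction, here is a toy example of code writing and running other code. It's trivial next to a self-improving A.I., but the principle (a program producing and executing a new program) is the same; everything in it is made up for illustration.

```python
# A toy example of "code that writes code": generate a small Python
# program, save it to a temp file, and run it.
import subprocess
import sys
import tempfile

generated = "\n".join(
    f"print('{n} squared is', {n}**2)" for n in range(1, 4)
)

with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
    f.write(generated)
    path = f.name

subprocess.run([sys.executable, path], check=True)  # runs the generated code
```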




Do Not Panic...


Don't worry, it isn't all bad. When a new technology comes out and destroys jobs, it also creates new ones. So when novelty hat makers are put out of work by the invention of a novelty-hat-making machine, there will be new jobs manufacturing, repairing, and maintaining that machine. Board games have decreased in popularity since video games became mainstream, but the video game industry hires people to program, write music, design art, manufacture, distribute, and sell the games. Vending machines don't really interfere with employment at retail stores, and they create several jobs. The Internet is one of the greatest technological achievements of all time, and it has created countless jobs and ways for people to make money (like eBay... or creative scams). There are of course behemoth Internet companies like Google, Facebook, and Amazon that are worth billions of dollars each and employ a HUGE number of people. Technology may be destructive to some industries, but it has also created tons of jobs that weren't even imagined before it existed.