Wednesday, April 21, 2010
IT Future and Final Blog Entry
Many people are afraid of new technology and want to stop its growth because they feel it will somehow replace human beings. I can see some possibility in these ideas, but in my opinion the chance is very small, mainly because I see no ambition in machines to replace humans; they are machines and have no human ambitions of power. I personally believe that humans and machines must evolve together, and possibly even become interconnected in the future. This has already begun on a small scale with medical implants, bionic limbs, and even technologies that can sync with people's brains, like the BrainGate example we watched in class. In my mind the possibilities are basically endless, and I hope to see marvelous advancements in my lifetime.
In this class I have learned a lot about a complicated topic. My knowledge of information systems was very limited before this course. Although we only scratched the surface, I am glad I was able to gain basic knowledge that I hope to expand on in the future. Information technology is the way of the future, and in the interest of not being left behind I will learn as much as I can so that I can properly use information systems for the remainder of my college experience, into my professional career, and wherever else life takes me.
Tuesday, April 20, 2010
Eastern Tuition
Eastern Michigan University seems to be the only college in the area not raising its tuition. Michigan State, U of M, Saginaw Valley, etc. are all raising tuition by thousands of dollars each year, but Eastern is not. It seems things are getting worse with the economy, and even worse with the amount of money schools need. Eastern did raise its fees last year, putting a 3.8 percent tuition increase in place for the fall semester on July 14, 2009, but for this year it has decided not to.
Sunday, April 18, 2010
Future discussion and Singularity
Ray Kurzweil was one of the first people to talk about how far we are taking technology. Though he contradicts himself a few times in the book we discussed in class, he predicts that androids will be here. He also stated that a computer with the calculation power of a human brain should arrive around 2029, and that around 2049 a computer will exist with the calculation power of every human being on this planet. With these kinds of predictions, I kind of want to see what these computers could do.
Our debate on the last day of class ended with every person having their own opinion of what the singularity might bring. For me, only the future can tell what might happen with the singularity. Take sequencing DNA: in the 80s it was supposed to take 200 years, but once started it took only 13. Technology seems to have nothing blocking it from improving. Once androids arrive that can think for themselves, we can only hope for the best.
link to singularity: http://www.singularity.com/
Ethics
Ethics also depend on your religious beliefs, which makes it even harder for people to define them. As we saw in our class discussion, every person there had their own opinion about ethics. In my honest opinion, ethics should not really have a definition. I feel this way because when a word has a definition, most if not all people agree on it. Ethics is one of the most debated topics we as humans have.
Ethics can also conflict with itself. A major conflict in ethics is doing something wrong to help someone in need. For example, in a true story, a man's mother was dying from a disease; the doctors had the medicine she needed, but the family could not afford it. The man got so tired of seeing his mother suffer that he went to the store that carried the medicine with a gun. He robbed the store, but took only the medicine for his mother. It is very hard to say whether this was ethical or not; personally, I did not see it as unethical.
link to ethics: http://www.scu.edu/ethics/practicing/decision/whatisethics.html
Saturday, April 17, 2010
12th (and final) post
IT Future
Case studies
In the last two weeks of class, we've been learning with a different approach: case studies. Case studies are a great way to approach a topic, and with how class has been these past two weeks, I think most agree that these have been great. Case studies are meant for many levels of learning, and some institutions, like Harvard, use them almost exclusively. These can give us an interesting idea of how real-world business works.
The nature of our case studies, and the reason class discussion was so interesting, is that they pull together more concepts than just the foundations of IS; they require us to think about many soft concepts. An analysis of how people work, ethics, and our world's current problems gives many more people the ability to contribute, with so much to talk about. When more specialized fields become the topic, people often have less to say.
One idea I thought might help is to introduce case studies earlier in the course. Maybe establishing a consistent frequency of case studies would help - it might give everyone a day to look forward to. What do you think about case studies?
Friday, April 16, 2010
Thanks Eastern
In the beginning I really didn't know what to expect from this course. I don't know that much about computers in general. I thought it was really interesting to hear about how they originated; I didn't know all the things they can do. I am glad I wasn't thrown into a class that depended on knowing and understanding all of the technical terms. That would be important if you were going into this field, but since this is just a requirement for me, I was a bit relieved.
The Human Genome Project
As a benchmark for the importance of reverse engineering the brain, and the time needed to do it, we discussed how quickly the Human Genome Project was completed. The Human Genome Project was expected to be completed in 2090 and was instead finished in 2003. That's a difference of 87 years. This figure also gives us an idea of how much we underestimate the pace of progress.
The Human Genome Project was also a global effort by several advanced countries, which shows how large a role the "flat world" factor plays in how we estimate, or underestimate, the future. Outside of the human genome, the project also sequenced E. coli, the fruit fly, and the lab mouse. These three genomes play a large role in understanding the human genome and other scientific areas.
However, the human genome is only complete in general terms. Several specific areas of the genome, such as telomeres, are still incomplete. As of today, the best estimate is that the human genome is about 92.3% complete. The most interesting fact is that the human genome contains roughly twice as much DNA as the roundworm's, and almost the same amount as the mouse's. This data will lead us to very important cures as we obtain the technology to put it to practical use.
12th blog
In the last two classes we talked about ethics and the future, respectively. Ethics has always been a topic briefly covered in various classes. Ethics is a system of moral principles, or the rules of conduct recognized in respect to a particular class of human actions or a particular group or culture.
A big threat to the future of any business is a failure in ethics. Business ethics as an issue is much more powerful than globalization or even the internet, and can destroy a business almost instantly. The standards of business ethics, however, are changing rapidly. What was good is becoming bad, and vice versa, very fast.
A good site to get some information on this is: http://www.globalchange.com/businessethics.htm
When I think about the little heated discussion we had in class about the future and what it has in store for us technology-wise, I think about robot people. I think we need to take a serious look at what's going on around us and keep ourselves in check, making sure we are not relying on technology too much. It has been said that 97% of children today play video games for more than three hours a day. I feel this is just one example of how we abuse technology: kids use it as something to do instead of going outside or reading, and parents use it as a babysitter.

The technology we have now is so great that it can help us in so many ways. I just don't want people to become unable to help themselves. I want to see the day when people are able to use a chip to store information in their brains, but I don't want to see the day when everyone uses that chip to do everything for them. I can't say I will ever use a chip, but it is all just part of history that I can see happening. Technology will continue to advance whether I want it to or not; I just want it all to be used the right way.
The Singularity and Ethics
Is the Singularity really coming and how will it affect us? These are two unanswerable questions so far, unfortunately. Let's assume that the Singularity is indeed approaching and will arrive at a rate consistent with Kurzweil's predictions. The ramifications are unclear because it is difficult to predict what this will mean for us as humans. Certainly, it would appear troubling, particularly to those of us who have watched the Terminator, that this prediction includes a reality in which machines exceed the intelligence of the average human brain. In this scenario, machines could theoretically build increasingly intelligent machines without human intervention or help. If these machines have no inherent concept of ethics or morals, isn't there a strong likelihood that human and machine goals could become incongruent? In this scenario, isn't it also possible that machines could at some point decide that humans aren't "useful" or "necessary"?
Case Study
I have found the case studies to be very interesting. There are things I didn't know before reading them. I really liked the one about Medivet. I thought it was a cool concept to be able to go to a clinic that is easily accessible wherever your day may take you. This would also be helpful in an emergency, or if your usual vet didn't have the necessary equipment or you weren't nearby.
These three words, integrity, commitment, and truth, I believe play a large role in ethics. I believe all people should be ethical, not just in business but in all aspects of life. I feel it is important to be ethical because, in a lot of ways, being ethical means being fair, and that is the best way to make sure all people are treated equally. If no one were ethical, businesses would fail fast because, as has been brought up in class before, people don't want to do business with people who are unethical.

I think it can be hard at times to be ethical, because in a business you are there for pretty much one thing only: to make money. There are a lot of unethical things you can do to make money. Say you had a service that some people wanted and some people actually needed. You want all the business you can get, so for the people who just want your service you give them an okay price, just to make sure they use you. For the people who need it, you charge an arm and a leg, because you know it's not a choice for them. It is completely unethical, but you make more money. There will always be people out there who don't care at all about ethics and will do anything they can to get to the top. In some situations they are far more successful, and that is what makes it hard to do things in an ethical manner, but I still believe in ethics no matter what.
Technology: everyday life
Ethics and philosophy
One of the questions asked in class Tuesday was "why be moral?" Well, it would take well over 100 years to answer that question in full. As a brief synopsis, I would say that ethics affect more than you would think. So, why be moral? Who's going to care? Who's it going to benefit?
Early economists proposed a theory of human behavior. They believed that all human behavior is motivated entirely by self-interest: a person's decisions basically come down to weighing the cost of buying a product against its benefit. But there are definitely people out there who make decisions not based on a cost/benefit analysis. Examples would be turning in a lost purse to the lost and found, risking their own life to save someone else, or donating blood.
Aristotle created an ethical theory labeled as self-realizationism. In his definition, he claimed that when a person realizes their full potential, they will be content and have a happy life. It is those who don't realize their full potential who live unhappy lives. "Nature does nothing in vain", as Aristotle boldly stated.
So, as you can see, the ethical decisions made in your life have an effect on the rest of your life, and sometimes on others. The need to be ethical is most people's drive forward. It is especially important to a business: if you want to succeed, you have to help other people do the same. If you can recognize the difference between good and bad ethics, you can do pretty well for yourself. So I don't think there's anything the human race can't solve somehow, as long as people are moral and make moral decisions.
Sources: Sahakian, William S. & Sahakian, Mabel Lewis. Ideas of the Great Philosophers. pp. 33-35. Barnes & Noble Books (1993). Kwan, Michael. Beyond the Rhetoric (official blog).
Dell Theory
The Dell Theory of Conflict Prevention is a theory Thomas Friedman presented in his book The World Is Flat. The theory stipulates: no two countries that are both part of a major global supply chain, like Dell's, will ever fight a war against each other as long as they are both part of the same global supply chain. I took a History of the 20th Century class last semester, and the professor brought up a similar theory: that the rise of global IT will decrease the possibility of a global war, like WWI or WWII, ever happening again.
When I first heard this, I thought it could not be true. I had always figured there was an inevitable WWIII bound to happen. Now I realize that will never happen. Once every country in the world becomes industrialized, there will be no more world wars. What country would want to go to war with another country that was full of its own businesses? If that were to happen, the global economy as a whole would collapse. An example would be if the US were to attack Japan as it did in 1945. Japan is now full of US-owned businesses and many businesses that thrive because of the US market. If all imports and exports to and from Japan were blocked, the US economy would take a major hit.
This is just another reason why global IT is such a success. However, the Dell Theory is not a guarantee; it means the governments of these countries will face very heavy economic costs if they go to war. A real-life example of this theory is the China and Taiwan relationship. Both have strong supply relations with each other, so a war seems very unlikely today.
Last blog
I wrote several blogs about what was going on in class, but the blogs I really liked were the ones others did on the same topics. Blogging has been a great way to see how others think about everyday topics. I noticed that RFID chips were blogged about a lot, and I thought it was cool to see how many different opinions there were, from those who can't wait to those who would never even consider it. It seems like blogging is the most civilized way to have a discussion with a large group of people: you can write what you think, and anyone who wants to can comment. This is a lot easier because it lets everyone talk at the same time, but also stay on topic.
The networking possibilities are also good. You can easily see how people would get to know each other. I can also see blogging being used to talk about yourself, or even as a way to get to know those you work with. It seems like the more you're interested in the topic, the better the discussion and the better the quality of the information. This naturally lets you see, out of all those you're blogging with, who is interested in the same things you are. I blogged about some of the worst inventions, which was something I really found interesting, and I saw that it ended up getting a lot of comments. Blogging has a lot of different uses, and I think that if you use it wisely it can really help you succeed in sharing ideas.
Accountability and Data Collection
Privacy
Now, I realize my thoughts on a global community watch are pie-in-the-sky ideas. I have obviously not put much thought into how this could be enacted. However, with the technology at our fingertips today, it would not be hard. Think of all the crimes that have been caught on tape by average citizens; we have seen it everywhere from Seinfeld to YouTube. Perhaps if we took this small-scale vigilante role a bit more seriously, we could drastically reduce the incidence of crime while still retaining some of our privacy.
Electronic Mail
In class there was a discussion about how everything is starting to become electronic. Whether you're shopping, checking your bank statement, or applying for a job, it's done electronically. But the thing I really started to focus on was electronic mail. I remember when I was younger I would open all of the "junk" mail, pretend it was mine, and tell my mother how I couldn't wait until I was old enough to receive mail. Well, now that I'm old enough, the only mail I'm receiving is packages from when I order something online.
Brain Embedded Chip
The idea of having a computer chip embedded in your brain will probably scare you at first, but take a look at the pros and cons before you make up your mind.
The positives of having this chip inserted are that it can give you information directly without the use of a computer, it will put doctors at ease when all they have to do is read the chip instead of running blood tests and the like, and you will never have to carry around a laptop again.
Think about it: all the information in the world, stored right inside your brain. You won't need any more resources. Not to mention people who have Alzheimer's in their family. It's scary to know that the disease may affect you and your family, and they may look for any alternative to keep it from manifesting. So they look to technology: a computer chip could enhance someone's memory and retrieve information from long-term memory. From a business standpoint, competition would be down and there would be no need for many things. Lastly, a doctor would no longer need charts to see your medical history; they could just use a scanner (kind of like the ones in common stores). They might even share the information telepathically. We never know....
Thursday, April 15, 2010
Tenure Sucks
With the end of the semester upon us, I suddenly realized that I was quite a bit behind on my blogging for Dr. Drake's class. Partly due to my affection for procrastinating, and partly due to my disaffection for everything blogging represents, I have found myself in a dire situation regarding my "A" in IS215. So tonight, my vain attempt to catch up on my blogs begins. I desperately hope that Dr. Drake will take note of my flood of high quality posts and grade me compassionately.
For my first blog (fifth or so of the semester, first in my futile attempt at salvation) I want to discuss a topic that was only briefly mentioned in class the other day and isn't really pertinent to information systems. I was astounded to learn that EMU professors achieve tenure after only five years. According to Dr. Rick Camp, PhD, organizations are making an aggressive move away from non-performance-based rewards (i.e. tenure, seniority) and emphasizing rewards based solely upon performance. Now, I understand that this tenure policy has likely been in place since the Normal School days and probably isn't easily influenced by fads. However, the facts show that these kinds of benefits do not foster high performance. Often, highly qualified and passionate professors are laid off due to budget cuts because they lack the tenure of less qualified, unmotivated older professors. Likewise, professors who, quite frankly, suck are protected because they have managed to hold a job for five years.
Although EMU is a highly touted educational institution, many people forget that EMU and other state U's have a HUGE business side. That being said, I would like to augment my poor subjective argument above with this statement: TENURE IS BAD BUSINESS. It ensures that the school has to pay out the most expensive contracts (which continue to grow yearly as tenure protects them) while leaving the inexpensive ones to the whim of budget cuts. Not very thrifty in these hard times. It also protects the jobs of professors who are no longer professing at the peak of their abilities. Meanwhile, young talent with promising careers ahead of them doesn't get a chance.
I know this has nothing to do with IS, but it's a topic that means something to me. I have had a slew of horrendous professors in my three years here. Most of them are tenured and just don't give a shit anymore. My other reason for this diatribe is the wonderful class discussion we had the other day. I've had teachers who have been teaching here for over 20 years fail miserably to engage a class half as much as Dr. Drake did yesterday. I compliment John Drake for guiding a terrific discussion and encouraging students to think critically and speak their minds (we've all had those quiet, awkward classes where getting students to speak is like getting a horse to sit). He is a perfect example of why EMU should get with the times and consider revising its tenure policy in order to attract new and young teaching talent to our school. Lord knows it might engage our students and help improve our 33% graduation rate.
Music Industry
Robot vs Human
Today I came home and was pleasantly surprised to see a few articles featuring robotics in the May 2010 issue of Discover Magazine. One in particular caught my eye: "Machine Dreams". Discover Magazine's editor in chief, Corey S. Powell, moderated a discussion between four roboticists (Robin Murphy of Texas A&M, Red Whittaker of Carnegie Mellon, Javier Movellan of U.C. San Diego, and Rodney Brooks of MIT) concerning different areas of robotics.
One question posed to Rodney Brooks, founder of iRobot (which makes the Roomba), was "...you've talked about four goals that robot researchers should be aiming for. What are they?" In essence, his response was that they should aim to create robots that have:
1) "...the object recognition capabilities of a two-year-old child", as in being able to recognize that two different objects are still the same type of thing (their example was that two different chairs are still both chairs)
2) "...the language capabilities of a four-year-old child"
3) "...the manual dexterity of a six-year-old child" (i.e. the ability to tie shoes)
4) "...the social understanding of an eight or nine-year-old child", the ability to take social cues from others
The panelists also spoke of how robots could be most beneficial to humans. One way robots can be used is in space exploration. They don't have the biological requirements (food, air, water, warmth) that humans have, and they are better suited to dangerous or lengthy missions. This could open up exploration in areas humans would not be able to access easily (or at all), like caves on the moon, the moons of the outer planets, or possibly even reaches of space beyond our own solar system.
Robots can also be used in other areas that could pose a threat to human survival or cause harm to humans. For example, they are already being used to test for explosives, to search and rescue in floods and under deep rubble, and for low-altitude flyovers in places with heavy tree cover.
Of course, there are a multitude of ways that robots can be useful to humans. They can assist in medical procedures, do manual labor, work on assembly lines; really, the options are pretty much endless. Will they replace humans? In some areas, yes. Progress always requires a shift in the way work is done, though, and new jobs will likely open up in other areas as technology expands the horizons of what is possible.
Cranial Chip
Motion Sensing Technology and Dance
UK National DNA Database
In class on Tuesday we discussed the implications of incorrect data collection and I brought up the example of The United Kingdom National DNA Database (UK NDNAD, officially known as the UK National Criminal Intelligence DNA Database). The UK NDNAD was the world’s first national DNA database and now stores the genetic details of about 5.3 million people. It is currently the second largest DNA database in the world, second only to FBI’s database in the US. The NDNAD is comprised of DNA samples that are collected from crime scenes, and by everyone who is ever, “arrested on suspicion of committing a ‘recordable’ offense,” in the United Kingdom, which, as of 2006, includes infractions such as not wearing a seat belt. In 2005, the UK NDNAD held the profiles of over 585,000 children under the age of 16, since “everything from littering to skipping out on bus fares,” qualify as grounds to collect a DNA sample.
When new DNA profiles are added to the NDNAD, its information systems automatically search for matches between individuals and genetic material from crime scenes. In recent years, in an attempt to solve more crimes, the database has begun doing familial searching: searching for family members of a wrongdoer whose own DNA is not in the database. Also, in an attempt to single out the citizens most likely to commit violent crimes, behavioral analysts within the UK's Violent Crime Directorate use the NDNAD's information systems to construct psychological profiles of people in the database based on all the information ever collected about them. These potentially violent people are then monitored.
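As a purely illustrative sketch (this is not the actual NDNAD software, and the "profiles" here are made-up tuples of marker values), the automatic search that runs when a new profile is added might look something like this:

```python
# Illustrative only: a DNA profile is represented as a hypothetical
# tuple of genetic marker values, and every newly added profile is
# compared against the stored crime-scene profiles right away.

# Made-up crime-scene profiles: scene name -> marker values
crime_scene_profiles = {
    "scene_A": (11, 14, 9, 30),
    "scene_B": (12, 15, 9, 28),
}

def search_matches(database, profile):
    """Return the crime scenes whose stored profile matches the new one."""
    return [scene for scene, markers in database.items()
            if markers == profile]

# A newly added profile matches the material from scene_B.
print(search_matches(crime_scene_profiles, (12, 15, 9, 28)))  # ['scene_B']
```

Even this toy version shows why the mislabeling problem below matters: the search is only as trustworthy as the records it runs against.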
In regard to what we discussed in class, it has recently been found that half a million records in the NDNAD are labeled with incorrect names. So, potentially, you could be monitored for the rest of your life and assumed violent when you have never done anything wrong. This is an obvious case of the problems that arise with incorrect information in databases, and of how influential and destructive these mistakes can be.
http://people.howstuffworks.com/future-crime-database.htm
Did IS 215 Just Get Interesting?
The last few class periods have been full of discussion based on the case studies and analyses that Dr. Drake assigned to us. These cases created class discussions full of similar and opposing viewpoints, and for the first time this semester, they made us as students think just a little outside the box. When I first signed up for IS 215, I imagined we would be in front of computers learning the ins and outs of technology. I thought I might learn shortcuts for different programs, or how to effectively use search engines on specific topics. I was obviously wrong.....
As all of you know IS 215 is a lecture driven course designed to inform students about the history, current use, and future of Information Systems. Unfortunately for me, this was a subject that I had no interest in....until the last couple of weeks of class.
The case studies have made me look at information systems in a completely different way. Just when I thought the class wouldn't teach me anything, I may be proven wrong by the end of the semester. During our in-class discussions I've had the opportunity to listen to a lot of different thoughts and opinions, some a little more dramatic than others. But as a student, I believe that is where the real learning takes place: when I can hear the thoughts and opinions about a subject from those around me.
I think I may be surprised when I walk out of room #221....or I may be getting a little ahead of myself.
iPad
The Amazon Kindle is aimed more at book readers and comes with 3G wireless service at no monthly fee. The iPad's optional AT&T 3G service costs $29.99 a month, and the iPad also comes equipped with Wi-Fi connectivity. The Kindle has a read-aloud feature: you don't have to actually read, as it will read the book or document out loud for you. The iPad does not have this feature. The Kindle has a black-and-white display, while the iPad has a color display, but the Kindle can be read in direct sunlight, unlike the iPad. The Kindle is priced at $259 for the 6" model, which holds 1,500 books, and $489 for the 9.7" model, which holds 3,500 books. The iPad costs anywhere from $499 to $829, depending on how much storage space you think you'll need.
I could go on and on with the comparisons, but I won't. In my opinion the Kindle is the better buy, but I feel more people will purchase the iPad because of the brand name, the capability to watch videos, and the ability to download apps. In the long run, the user will probably spend more money on the iPad, trying to keep it updated with the latest and greatest apps and other functions.
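To put that long-run cost point in rough numbers, here is a quick two-year tally using the prices above (assuming the buyer keeps the optional $29.99/month 3G plan the whole time; app purchases would only widen the gap):

```python
# Rough two-year cost of ownership, using the prices mentioned above.
months = 24

kindle_total = 259                  # 6" Kindle; its wireless has no fee
ipad_total = 499 + 29.99 * months   # cheapest iPad plus optional 3G plan

print(kindle_total)              # 259
print(round(ipad_total, 2))      # 1218.76
```

Even without a single app purchase, the data plan alone more than doubles the iPad's sticker price over two years.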
Which One are You?
On Tuesday, one of the last subjects we discussed was precautionary versus proactionary thinking. More often than not, proactionary people and thinkers create and do things without fully understanding the capabilities and consequences of their thoughts or creations. Precautionary people and thinkers are all about understanding the consequences of what "could" happen.
The creation and use of the atomic bomb during WWII is a good example of proactionary thought put into action. The creation and use of the atomic bomb was deemed necessary to end WWII. Today everyone knows the consequences of using the atomic bomb, but in 1944-1945 the future consequences of those actions were either not thought all the way through to their conclusion or, my favorite, "couldn't truly be understood." My grandfather was a WWII veteran, so I have always argued that the atomic bomb was used appropriately, because it did indeed end the war. In the last few years, with thoughts of terrorism popping up left and right, I have wondered what may have transpired if the US had never created or used the atomic bomb. But that is a matter of speculation and opinion...
Global warming is an example of precautionary thinking. Scientists have been studying the Earth and its atmosphere for years. They understand what hurts our environment and what benefits it. Scientists know what we as a people need to do to prevent the melting of the polar ice caps, the extinction of wildlife, and the thinning of the ozone layer. Activists and scientists have been arguing for reducing the burning of fossil fuels and limiting offshore oil drilling rigs. They have pushed for the use of hydrogen fuel cells, wind energy, electric power, and so forth. This principle psychologically defines those who assess the consequences before preparing to turn a thought into reality.
These two principles were created to differentiate between two types of thinking: those who go for it no matter what, and those who step back in fear of what may happen. Precautionary and proactionary minds surround elected officials for the specific purpose of presenting both sides of an idea.
So which one are you? Are you a person who engages in a thought or activity before thinking about what could happen, or are you the type of person who thinks all the way through, and even beyond, the conclusion of your actions?
Technology
I think companies sometimes get carried away trying to come up with the next best thing, the highest-tech gadget. I know all businesses are in business to make a profit, but at what cost? Are they weighing profits against legal fees from lawsuits? It sure seems like it. I'm not sure how true it is, but I can remember hearing that cell phones can cause brain cancer. I'm sure cell companies are saying that if it's true, they'll deal with it when it happens.
America is cutting the cable
Well, this is new: nearly 800,000 households have dropped their cable company. Don't worry, though, because cable companies still have about 100 million subscribers in the United States. But still, to have 800,000 households cut cable is major. A new report claims that many people are watching TV via Netflix and watching it online. Many people these days only watch specific channels and specific programs, so why pay $100-$200 a month when you can just get online and watch on your own time? I personally will not cut the cord, because I like watching sports.
Sports are probably the only thing I watch on TV, along with Lost, but that's about to end pretty soon. Sports are a major factor in people's decision not to cut cable. A game is the most live event you watch on TV, and you can't wait and watch it online later because you'll probably already know who won, so there would be no need to watch it. Netflix and online streaming are still pretty new to us, but what's going to happen in 10 years? Will we see the end of television? Will everyone be watching their stuff online? TV is a major social gathering activity for friends and family these days, and I feel like that's going to fade once Netflix and web streaming become more popular.
12th and Final Blog Post
Wednesday, April 14, 2010
IT in Retail
There have been many advances in technology for the retail industry. Due to all of the advances, the National Retail Federation created an organization focused primarily on IT. This organization is called the Association for Retail Technology Standards (ARTS). ARTS was established in 1993. According to their website, ARTS is a retailer-driven membership organization dedicated to creating an open environment where both retailers and technology vendors work together to create international retail technology standards.
In the organization's 17-year history, ARTS has developed four standards of significance: the Retail Data Model, Unified Point of Service (UnifiedPOS), ARTS XML schemas to integrate applications within the retail enterprise, and standard Requests for Proposal (RFPs) to guide retailer selection of applications and provide a development guide for vendors. The Retail Data Model was created to allow retailers to select applications from vendors whose applications were developed using the Model. UnifiedPOS is a device interface standard that allows retailers to add new devices to sales floor terminals with minimal, if any, program changes. ARTS XML schemas greatly reduce the time and cost of integrating applications. Standard RFPs, developed by a committee of retailers, vendors, and consultants from previously used retail RFPs, greatly reduce the cost of developing RFPs and ensure that retailers review all the potential application features and functions to select the "right" application for their business.
Since the IT industry is growing more and more every day, ARTS is there to help retailers adapt to the new technology being introduced. Now retailers do not have to worry about how their company will be able to "catch up with the times."
The Future
Privacy
It is hard today to keep most of our information private. With all of this new technology, and with how much information we enter into our computers, many different people could potentially have access to it. The article that we had to read for class mentioned a case in Florida where cameras were put in public bathrooms at least one day a week. The legislature said it was being done for the good of the people, but most people found it an invasion of privacy. Was it? To play both sides, the people had a point. I would not want cameras in a bathroom that I used, because I feel that it is an invasion of privacy. But the legislature said that, in the long run, it would benefit the people. Is it okay for them to do that when they are trying to make everything better?
Another question about privacy is whether certain things involving people working in state and local government should be kept quiet from the public. Another example from the article was how, when information was put together from different systems in the 1970s, it was found that highly paid city employees had unpaid fines. Of course, the general public was angry, but did they have a right to know? Privacy is something we take for granted, assuming that everyone will respect it. But what should people be allowed to see, and what should be kept private?
11th Blog
Here are some of the ways Skype has revolutionized communication between global teams:
- Basic teleconference equipment is expensive; therefore, companies were able to buy only a limited number of devices. The use of Skype through computer speakers has reduced the need for expensive teleconference equipment.
- Teleconference bridges don't exist in most developing countries. In places where they do, the expense is out of reach for typical global teams. Skype's conference facility has reduced the need for teleconference bridges for conference calls with five or fewer participants.
- Skype has increased the ability of the teams to collaborate, since they can use voice and IM simultaneously. Written communication is still the preference for sharing source code snippets, Web site details, phone numbers, etc.
- Skype also allows for impromptu communication. Due to the time zone difference between countries like the US and India, there's a need to communicate during early morning or late evening hours. Phone calls during those hours often catch people during dinner or after they have gone to bed. Skype allows users to see who is online before a call is made.
- The superior quality of Skype calls has encouraged users to communicate using voice rather than email or IM. This not only results in better communication but also helps in building better relationships between team members who are geographically distributed.
Freeware
Everyone likes to get things for free, and free computer applications and software are no exception. So, what sorts of computer applications can you get for free? A whole bunch of them! So many, in fact, that I will barely be able to scratch the surface of the offerings.
Google, the king of free applications, offers everything from free maps, email, calendars, and blog space to a ton more. The nice thing about their applications is that they come from a well-known and trusted source, and many of them can be tied together with other applications for ease of use (for example, Gmail and Google Calendar).
There are also free, open source applications such as those offered by Mozilla, the most popular of which is the Firefox web browser. Firefox is a solid browser with many free add-ons users can install for a customizable web surfing experience.
Most people are familiar with Google and Firefox, though. So, what other types of free software are available? There are free anti-malware programs such as Panda Cloud Antivirus Free Edition, free file transfer and sharing tools such as BitTorrent, free office programs like Google Docs, and very much more. PC Magazine recently had an article titled "The Best Free Software of 2010" that showcases some of the best free software available today. It covers everything from maps, conferencing, file sharing, security, networking, and video to quite a bit more.
Of course, as with anything you put onto your computer, it is always a good idea to be careful with what you download. For every great free program put out there by a trustworthy source, there are scores of others that will damage your computer, add spyware, or in some other way be dangerous or annoying. It's always best to do a little research first and know what you are getting yourself into.