Thursday, November 18, 2010

Digital Distribution: The Beatles

The Beatles have come to iTunes. In one of the most insignificant events of the year, Apple made headlines everywhere. Why is it insignificant? The Beatles were one of the most popular bands in history, but nearly every fan already owns the CDs, and they can use iTunes to rip those CDs for free. Even so, people are making a big deal over the new offering. Why? It's symbolic. Now that the Beatles have adopted online sales, the transition to online delivery is solidified. However, the transition won't stop with music, and it shouldn't.

Over the years, entertainment groups (TV, movies, music, books) have struggled with the move online. How do you protect your intellectual property in a digital world? How do you incorporate advertising (and make money) online? How do you keep online distribution from cannibalizing your higher-margin traditional markets?

The IT world has gone back and forth on DRM. After the disastrous Sony DRM debacle and iTunes' more successful (but equally frustrating) five-copy limit, online music has moved away from DRM. Ironically, illegal music downloads have decreased since then, showing that the best way to combat digital piracy is universal availability at low cost.

Hulu has revolutionized online TV. It offers a large selection of high-quality shows, many available the day after they air. Most surprising, though, is that it has done this using traditional advertising methods. Just as with over-the-air broadcast, the viewer's program is interrupted two or three times in a 30-minute period to show advertisements. The introduction of Google TV will likely drive even more viewers to this online distribution model.

Oddly, most groups don't like the idea of going online. Most companies have learned that if they don't put their material online, others will (in a less legitimate way), so they put it online, but with a delay. Ten years ago, if you had presented an almost-free way to create a second channel for a television station, the executives would have jumped on the opportunity. Today they are hesitant to adopt, but the problem is one of perspective. Executives need to think about getting their material to viewers in as many ways as possible; the money-making opportunities will follow.

Friday, November 12, 2010

Will social networking replace email?

On the one hand, social networking goes hand in hand with email. In many ways, it's email 2.0. It facilitates communication, helps you keep track of your contacts, helps you connect with new people you could benefit from corresponding with, and fits easily into a cloud-based file-sharing/knowledge-sharing/collaboration solution. Lately, social networking platforms have become bloated with advertising, games, music, etc. A business social networking platform would exist to facilitate communication, not to turn a profit, and therefore would not have these complications.

On the other hand, there are many social networking platforms and none of them work together, while email is truly universal and cross-platform. Writing an email is simple and mimics the physical process of sending someone mail, while a new system would redesign the communication process entirely. Thought of in terms of a business process redesign, the change would be very risky: it would impose a learning curve on one of the most basic processes and might make a firm less competitive in the short run.

We can see not only the possibility of such a transition but also its benefits. However, the technology needs to advance a little further (toward standardization) before any serious attempt to transition away from email. Even with a more advanced social networking platform that allows communication with other companies' systems, the transition would be a difficult and expensive one.

Sunday, November 7, 2010

iPhone Alarm Clock Bug

It sometimes seems silly to worry about becoming overly dependent on technology. You'd think, "worst case scenario, I can always go back to the way things used to work." Today's revelation of the iPhone alarm bug prompts us to ask: what do you do when a technology you rely on stops working?

For those unfamiliar: Apple has announced that the iPhone's alarm program does not recognize the daylight saving time change for repeating alarms. This means that as people with iPhones wake up on Monday morning all across the world, they'll do so one hour late. Read the story here.

After hearing this, I thought, "OK, as long as I know about it, I can set a different alarm." Then a weird thought hit me: I don't have another alarm clock. For years, I've relied on my cell phone as my morning alarm: it's always with me, I don't have to remember to set it, and the wakeup sound doesn't make me want to hurt someone. Sure, I have other clocks, but none of them have alarms. What to do?

An alarm clock. It seems so simple until you don't have one. It's easy to become overly dependent on a technology. Assuming you're acting rationally, that technology makes your life better, but you should always have a backup plan. That's why I'm headed to the store to buy a cheap alarm clock.

Sunday, October 31, 2010

The Right Mindset for Managing IT: A Critique

This week, I started work on my article presentation. The article is "The Right Mindset for Managing Information Technology." It is riddled with holes, misunderstandings, and mistakes. Since my job for class is to present the material (not to argue its flaws), I'll use this medium to present my thoughts.

The article compares Japanese IT management to US IT management and can be broken down into five main points:
  • Invest in IT following a logic of strategic instinct rather than strategic alignment.
  • Focus on performance improvement as a benchmark rather than ROI.
  • Don't adopt technology for technology's sake.
  • Focus on integrating the IT people into the business, and the business people into IT.
  • Don't focus on eliminating the user; focus on enhancing the contribution of the user.
Starting with the idea that IT should follow strategic instinct rather than strategic alignment, we have problems. Broken down into its fundamental parts, this is an argument to be reactive rather than proactive with technology. The article presents a couple of success stories to demonstrate the validity of its points, but it lacks a critical understanding of each of those situations: reactive IT strategies based on short-term improvement of operational goals require less investment, less redesign, and enjoy stronger support from upper management. It's like comparing LSU's chances in football against Auburn or Alabama to LSU's chances against a team like UL-Monroe. It's a compelling but incorrect use of statistics. The proper first step should be "Get upper management's support."

The second point involves focusing on performance improvements. While more legitimate than the first point, it too involves a massive misunderstanding of the IT function. The best IT should be invisible to the user; constant IT changes don't allow users to learn the system and hinder productivity. In addition, many IT processes don't contribute to productivity at all, but rather prevent breakdowns. How do you measure the performance improvements a disaster recovery plan adds to the company? How do you calculate the performance improvements of improving the scalability of an enterprise system for a company that wants to expand but hasn't yet? How do you quantify the performance improvements of replacing soon-to-be-obsolete parts to prevent future breakdowns? Most of the IT functions should be invisible to the user and focus on maintenance rather than performance improvement. 

The third point is one of the worst: don't adopt technology for technology's sake. The very nature of capitalism demands that people dream big and pursue those dreams. A manager in the West doesn't adopt technology for technology's sake; he listens to and shares a vision with the inventor. Seeing a potential competitive advantage, he invests in bringing it to fruition. Many of these attempts fail, but the same can be said of any R&D field. Should the pharmaceutical industry stop investing in research because most of it never makes it into a drug? No, because the pay-off of inventing the next Lipitor is worth it. The authors hide behind the idea that we can just do what the Japanese do: adopt technology later in the product cycle, when it's no longer cutting edge. They are correct, except that technology is not a natural occurrence. If companies don't invest in developing technology, it won't exist for the Japanese to piggyback on later. I read an article reviewing a state-of-the-art technology a few years back, and the author astutely stated, "When I hear complaints that it's too expensive, I grin and hope that all my competitors feel that way."

The fourth point is the only correct one of the group, but not for the reasons the authors give. The article talks about getting IT managers more familiar with the business. This is a definite shortcoming of the field, but I look at it from a different angle: get the business people familiar with IT. Business managers today are intimidated by IT; they don't think about how to use IT to improve their productivity. No amount of "business seminars" or "organizational bonding" is going to get a computer engineer thinking about the business side of the organization. The people who go through the steps of the business process on a day-to-day basis need to be thinking about IT, automation, and redesign. They don't need to know the how of IT; they just need to know what it can do. The article unwittingly supports this point of view: it explains that the Japanese rotate business managers through the IT function, and those managers then take that know-how into every other position they hold.

The last point takes a limited view of technology in saying don't focus on eliminating the user; focus on enhancing the contribution of the user. The ideal IT solution is a fully automated system in which your vendors put inputs in the right place, out pops a finished product, and your distributors come and pick it up. Everything in the middle is superfluous. In our society, it has become so taboo to focus on eliminating jobs that a point such as this one is heralded as "common knowledge." In reality, labor is expensive and often inefficient. The main things that separate the abilities of a worker from the abilities of a computer are judgment and strategy. Since our computers are imperfect right now, systems should automate everything they can and enhance the contribution of the user everywhere else. Just because a computer can't do something right now is no excuse for ceasing to move in that direction. Consider computer chess. Twenty years ago, it was foolish to think that a computer could possibly beat a good human player; today, chess programs routinely beat grandmasters. Technology is ever-evolving; don't use current limitations to limit your belief in what it is capable of.

Friday, October 22, 2010

Google TV

There are lots of things to love about Google. Their technology is hands down the best in its field. They don't charge consumers for use. Their inventions and innovations have literally transformed how we use the internet. Now they're focusing on other technologies.

I've already written about Android, today I'm looking at Google TV. Speaking on the Apple TV back in June, Steve Jobs stated:
"The TV is going to lose until there's a better--until there's a viable--go to market strategy. Otherwise you're just making another Tivo. It's not a problem with technology, not a problem with vision, it's a fundamental go to market problem."

Asked if it made sense to partner with a major cable company the way Apple partnered with AT&T to bring the iPhone to market, Jobs said, "Well then you run into another problem. Which is: there isn't a cable operator that's national. There's a bunch of cable operators. 

"And then it not like there's a GSM standard where you build a phone for the US and it also works in all these other countries. No, every single country has different standards, different government approvals, it's very… Tower of Bableish. No, balkanized."

Jobs concluded by saying, "I'm sure smarter people than us will figure this out, but that's why we say Apple TV is a hobby; that's why we use that phrase."

Going against Jobs' gloomy outlook for the TV industry, Google is trying to prove itself the "smarter people" with its new release.

Google is releasing Google TV both as a set-top box and as built-in software for TVs. It's like Android handsets in many ways: TV manufacturers no longer have to put in a half-hearted effort to build a usable UI, because Google's built a great one.

More importantly, Google has solved the issue of how to compete with cable providers. By offering value to TV manufacturers, Google TV gets installed on your TV before you ever get hold of it. Then Google offers video content using the internet as a delivery mechanism rather than cable. By running side by side with cable, Google doesn't force consumers to choose which is better (or lay out huge sums of money to try a new technology). In the abstract, Google has solved the go-to-market problem.

We'll see if it catches on; I know I can't wait to try it.

Friday, October 1, 2010

Centralizing the IT Function

In this article, Bill Brenner looks at the growing trend of outsourcing IT security operations. He makes a compelling case that, thanks to specialization, these security firms can provide better data security for less money.

This isn't a new trend in IT. Small businesses used to host their own email servers. Today, even some big businesses outsource that. Virtually every IT process is subject to economies of scale. As time goes on, that only becomes more true.

Take a maintenance position. Not too long ago, a person had to sit at a computer to troubleshoot it (or direct someone who was sitting at it). With the rise of remote desktop software, any computer can be controlled from anywhere. I can sit at home and troubleshoot problems in three different countries if I want.
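Today that remote workflow can be as simple as a shell script. Below is a minimal sketch of the kind of first-pass diagnostics a technician might gather; the `admin@somehost` address is a hypothetical placeholder, and the same commands run identically whether invoked locally or piped over SSH:

```shell
#!/bin/sh
# First-pass health diagnostics for a machine.
# Run locally as-is, or remotely with something like:
#   ssh admin@somehost 'sh -s' < this_script.sh
collect_diagnostics() {
    echo "--- uptime ---"
    uptime                   # load averages and time since boot
    echo "--- disk usage ---"
    df -h /                  # free space on the root filesystem
    echo "--- top processes ---"
    ps aux | head -n 5       # header plus the first few processes
}

collect_diagnostics
```

The point isn't the specific commands; it's that the troubleshooting session no longer requires anyone to be in the same room (or the same country) as the machine.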

Long story short, with the exception of higher level IT planning, all aspects of IT are best served when outsourced to a centralized business.   

Friday, September 24, 2010

Open Source Standardization

Android, the first major open source mobile OS, has found huge success on devices from virtually every manufacturer. People are now running into the problems that arise when a manufacturer edits Android before distributing it. While these edits are ostensibly meant to improve the user experience, they are mostly used to cripple the OS: removing tethering, drawing attention away from unsupported functionality, and so on. In theory, successful open source projects put pressure on the entire industry to improve, but only if people actually have access to the open source build.

Consumers have lately called for all manufacturers using Android to be required to let users opt for a standardized Google build over the custom manufacturer build. Google's CEO Eric Schmidt disagreed, explaining that "if we were to put those type of restrictions on an open source product, we'd be violating the principle of open source."

Eric Schmidt gets it wrong. Open source is about being open to the end user, not to a manufacturer. These kinds of perspectives hold open source software back from its true potential. By requiring that manufacturers give consumers the option of a stock Android build, Google would ensure that customization by the manufacturer works toward the good of the user.

Friday, September 17, 2010

The Rise of Robotics

When computers were first marketed to the public (and corporate) sector, they were of limited value. They could only accomplish a handful of things and they were prohibitively expensive. As time went on, the capabilities of computing equipment increased while the price decreased. At a certain point, computers crossed the feasibility threshold such that their price could be justified by their abilities. The increased demand fueled the innovation that led to modern computers. 

Computers contribute value because they allow us to manage our information and automate repetitive knowledge-based tasks. They have changed the very fabric of our culture. Today it seems ridiculous to attempt to do much of anything without a microprocessor: laptop, desktop, cell phone. I believe the next IT transition will be in robotics.

Once the exclusive arena of large corporate manufacturing plants, robotics has become much more available to smaller companies. The rise in demand has driven more research, and while the field hasn't yet risen to the level of the personal computer, it seems to be following in those footsteps. Robotics today is drastically different from robotics 10 years ago; today's machines are significantly more capable. It's just a matter of time before a handful of these machines cross the feasibility threshold and begin showing up in our daily lives.

When they do make their appearance, they will fall under IT. Robots are just computers that can gather information and move around. They will redefine both the capabilities of IT and the job an IT professional does.

Take a look at the links below for a few very cool examples of where robotics is headed:

Thursday, September 9, 2010

Technology patents and imitation

It seems that with each new software release, competing products get closer and closer together. One company's competitive advantage is quickly copied and released by rival companies. This isn't a new trend; in fact, imitation is a cornerstone of "innovation" in the digital age. Let's start with the original digital knock-off: the graphical user interface. Popularized by the Mac in 1984, it replaced DOS-style command-line systems with a more user-friendly interface controlled by a mouse. This game-changing idea was quickly copied, titled "Windows," and put on the market by Microsoft in 1985.

From the beginning, people viewed technological intellectual property as different from tangible property. Normally a firm could patent a product and the relatively slow change in their market would allow competitors to sell a competing differentiated product. In the fiercely competitive technology markets, a product without all the bells and whistles of the competition will fall behind. Therefore, large numbers of people have justified relaxing intellectual property restrictions.
Imagine how the patent trial would go if a businessman started selling knock-offs of patented pharmaceuticals and tried to use excuses common to IT. Sure, the knock-offs came after the original, were inspired by it, accomplish fundamentally the same thing, but are functionally different because of this insignificant change. Assuming that dispute got to trial (it never would), it would be a landslide victory for the pharmaceutical industry.

Yet today, it's common to see patented products that are indistinguishable from one another. Take the Nook and the Kindle: they're essentially the same product. Look at the current generation of game systems: they all now have motion-control sensors. TiVo got pushed out of the market because cable companies were able to provide a "non-patent-violating" substitute in their cable-box DVRs.

It seems that in technology fields today, patents are filed not to stop competitors from copying, but to slow them down. If competitors have to stop and figure out how to make their product "functionally different," the inventor might get an extra month of exclusive use. As a result, patents today are ridiculous: they either attempt to obscure the details of a product, or they are written so broadly they could describe most technologies on the market.

I don't think there's much we can do about it, other than have some personal integrity. I just find it interesting to contrast Intellectual Property in IT with Intellectual Property in other fields.

What does this have to do with the Mac perspective? Everything. Companies no longer have to one-up each other with invention and innovation. There is little profit in creating game-changing software, and quite a bit of risk in trying. It may seem like this kind of environment is good for consumers because it helps keep prices in check, but that is not the case. We are setting ourselves up for an era of stagnation, trade secrets, and unspectacular progress. In this case, competition is making the consumer lose.

Friday, September 3, 2010

The Social Networking Frenzy

Everyone's on Facebook. Social networking is the technology trend of the moment. It doesn't matter if you're a high school student, a professional stockbroker, or a soldier deployed overseas: the prevalence of high-speed internet throughout most of the world has spurred people to flock to social networking sites. While not the first social networking site, the fad started with MySpace. Eventually the elegant structure, intuitive UI, and lack of tacky music blaring from each page you visit drew people to the superior choice: Facebook.

The true value of social networking comes from having a digital forum that is comprehensive. Facebook draws much of its value from the fact that most everyone in the country not only has a page but regularly updates it. With social networking, as with most things, structure is nice, but content is king. With Facebook, the users supply the content; the users supply the value.

Seeing the success of MySpace and Facebook, other social networking sites have attempted to get in on the action. Twitter showed just how ADD people could really be. LinkedIn attempted to split business professionals off from the more "consumer" networking sites. The list goes on, but a visit to Wikipedia's ever-growing list will show you there is a social networking site for just about every special interest.

Now Apple has thrown its hat into the mix. With the release of iTunes 10, Apple is introducing "Ping," its new iTunes-based music social network. You will often see me praise Apple on this site, but make no mistake, I'm also one of their toughest critics. Ping is a monumentally stupid idea.

Each time a social networking site draws a user to post on it rather than on a central site (let's say Facebook), it splits the available social networking content. In addition to people on Facebook not having access to that information, the poster doesn't have access to the people on Facebook. No users win; remember, content is king.

Apple saw the opportunity to integrate the latest fad into their software, and they were right to take it. However, creating their own social networking program was the wrong way to go. The smart move would have been to work with existing social networking sites, allowing people to put iTunes information in their profiles, walls, and status updates. Instead, they've chosen to compete with the market leaders in an already saturated market at the expense of the users.

You may have a different take on this, but I know I will never use Ping. If Apple had instead written an iTunes Facebook app/extension with the same abilities, I'd have music posts on my profile right now.

Wednesday, August 25, 2010

What the Mac Perspective is

As a longtime Mac user, I've often been the subject of scorn from my peers. When I took my first job as a computer technician, this only got worse. But I'm proud to say that I've always used Macs. I remember working as a kid on an Apple Quadra running Mac OS 7. I remember marveling at how big an upgrade Mac OS 9 was over its predecessor. I remember sitting in awe as I watched demos of the original Mac OS X. I still follow Apple's presentations, and I'm as much of a "fan boy" as they come.

"Why haven't you ever switched to Windows?" I've often been asked by my IT peers. It's a good question. For a long time most video games were only available for PC, and while that was tempting, it's not true anymore. For a long time you needed a PC to network with other PCs, and while that definitely discouraged buyers, it's not true anymore. For most of history, PCs have been cheaper than Macs, but while you can still buy a very cheap stripped down PC, much of the price delta between comparable models is gone now.

There is less reason for me to switch now than there ever has been. But why didn't I switch back when Macs had little software, didn't work with the majority of other computers, and cost a premium? Aside from early childhood indoctrination, I chose not to switch because Apple understood something that Microsoft is only now starting to see.

With every piece of technology there is a tradeoff. The technology may be incredibly functional but not user friendly, it may be incredibly user friendly but not functional, or it may fall somewhere in between. The goal of software companies (including Microsoft and Apple) is to maximize both of these, but even today it is a balancing act.

I'm under no illusions that the Mac operating system is the most capable one available (I award that distinction to UNIX). But from the beginning, Apple understood that you had to build an interface that people understood, and then pack it as full of functionality as you could (rather than building the functionality, cramming it together, and calling it an interface). The most useful tools in the world will be ignored by most people if they can't figure out how to use them.

Take VOIP for instance. VOIP has been around for decades, but it wasn't until companies such as Skype made it user friendly that people started taking advantage of it.

This is the essence of the Mac perspective. It's understanding that how people will go about using a product is just as important a consideration as what it can do. It's a perspective that goes way beyond just Mac vs PC. It can be used to evaluate every type of technology, and should be used to evaluate every type of user.