2009's Hottest Tech Trends

Our annual list of hot technologies includes a few that exploded on the scene recently plus some that have been simmering for years and just now are coming into their own.

802.11n: The 'N' Stands for 'Now'

It's been a long time coming, but 802.11n is finally here. And that means wireless LANs now are a viable replacement for wired LANs.

Without delving too deeply into the past, we all remember the war-driving horror stories: buggy Wired Equivalent Privacy implementations, 802.11b technology that promised 11Mbps but barely delivered 5Mbps, 802.11g technology that promised 54Mbps but barely delivered 20Mbps, and prolonged standards battles over 802.11n.

Yes, Wi-Fi technology has been a tad disappointing. On the other hand, we have come to expect and appreciate wireless networks in our homes, coffee shops, airports and hotels. And employees, particularly younger ones, impatiently await the wireless workplace.

In Network World's groundbreaking test of 802.11n access points and controllers, 802.11n technology delivered impressive data rates of 250Mbps per access point. In addition, it delivered solid performance numbers on latency and jitter, which means it can support such real-time applications as voice and video. The systems we tested had a variety of enterprise-level features, such as Power over Ethernet; dynamic radio-frequency control; QoS; and such security functions as intrusion prevention and detection, Wi-Fi Protected Access 2, and stateful firewalls.

When it comes to new construction, 802.11n should be the default choice. The choice gets tricky, however, when it comes to existing networks. If you already have some 802.11a/b/g gear, keep in mind that running a mixed network will result in significantly reduced bandwidth. In our tests, mixed-mode throughput was 24% of the throughput in an all-802.11n network.

Nevertheless, no matter how you decide to roll it out, 802.11n is ready for the enterprise.

Unified Communications: Getting Warmer

Unified communications is one of those technologies that's seemingly forever been on the verge of exploding but has never really become hot.

Maybe the reason is that the term "unified communications" means different things to different people. To the telecom manager, it means replacing the tried-and-true PBX with an IP-PBX from a traditional telephony hardware vendor or from an open source start-up - or maybe even jumping to a software-based platform from Microsoft.

To the desktop user, it means switching to an IP-based phone and taking advantage of a variety of such UC-based productivity applications as audio- and videoconferencing, instant messaging and presence, and integrated voice and e-mail.

To BlackBerry-toting mobile workers, UC means being able to use their mobile devices to perform all the business functions associated with an office phone. They want calls made to their desktop phones to bounce to their mobile phones. They want to dial into the office and have their e-mails and voice mails read to them. They want all their devices to sync up seamlessly.

The good news is that all these features are available today from vendors including Alcatel-Lucent, Avaya, Cisco, IBM, Microsoft, Nortel and Siemens.

UC technology isn't setting the world on fire, but it is spreading inexorably across enterprise networks. Nemertes Research recently found that just 16% of the 120 companies it surveyed are doing nothing with UC. More than one-third (36%) are in an initial planning phase; 28% have a limited deployment of specific applications that make up the technology, or a full deployment to a limited number of people; and 19% have developed their strategies and are implementing the technology companywide.

What are the drivers of UC? Years ago, the decision was all about saving money and about moves, adds and changes. Last April, when Gartner asked early adopters to list the three biggest benefits of deploying UC, the top answers were employee collaboration, employee productivity and communication for distributed sites. Lower total cost of ownership came in last.

Data Protection: It's the Data, Stupid

In today's world of mobile workers, teleworkers, thumb drives, BlackBerries and social-networking sites, IT executives can't worry about devices - they need to focus on protecting data wherever it is.

The obvious place to start - considering that an estimated 5,000 laptops are stolen or lost each year - is the laptop hard drive: It needs encryption.

Software vendors and such open source projects as TrueCrypt offer whole-disk encryption across all major operating systems, and Microsoft offers BitLocker disk encryption in Vista, so IT executives have no excuse for not encrypting laptop data. In addition, such hardware vendors as Fujitsu, Hitachi and Seagate Technology offer hardware-based disk encryption.

Another trouble spot is e-mail. A variety of e-mail encryption methods are available, but all of them run into the same problem - they require recipients of encrypted e-mail to go to a secure server and enter some form of identification before they can read the decrypted message. For most people, this is a nuisance that rises to the level of a deal-breaker.

Another way to approach e-mail security is through data-loss prevention. DLP tools scan outgoing e-mails for such information as Social Security numbers, sensitive keywords or other possible breaches. Then they flag the offending e-mail. Companies dictate how offending e-mails are handled: They can be returned to the sender, bounced to an IT manager or encrypted.
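
To make the idea concrete, here is a minimal sketch of the kind of pattern scan a DLP tool runs against outgoing mail. The patterns, keywords and resulting actions are illustrative assumptions, not any vendor's actual rule set.

import re

# Illustrative patterns a DLP filter might watch for (assumptions, not a real rule set).
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")          # e.g., 123-45-6789
SENSITIVE_KEYWORDS = {"confidential", "merger", "salary"}    # hypothetical watch list

def classify_outgoing_email(body: str) -> str:
    """Return the action a policy might dictate: 'encrypt', 'bounce_to_it' or 'deliver'."""
    if SSN_PATTERN.search(body):
        return "encrypt"                     # sensitive identifiers: encrypt before sending
    lowered = body.lower()
    if any(word in lowered for word in SENSITIVE_KEYWORDS):
        return "bounce_to_it"                # flag for review by an IT manager
    return "deliver"                         # nothing matched; send normally

if __name__ == "__main__":
    print(classify_outgoing_email("Our merger terms are attached."))   # bounce_to_it
    print(classify_outgoing_email("SSN on file: 123-45-6789"))         # encrypt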

DLP products, however, can be difficult to get right. That's because companies have to hammer out policies for determining which types of data need watching, what happens when an e-mail is flagged, and whether individual users should decide for themselves which e-mails or types of e-mails to encrypt. For example, the CIO might not appreciate it when an e-mail to the CFO gets flagged, bounced back or held up.

Other potential problem areas - everything from thumb drives to smartphones - abound. Nevertheless, vendors today are offering encrypted USB drives and business phones with encryption features. IT executives need to make data security a requirement every step of the way.

Green IT: A New World View

Can you afford to be green? Can you afford not to? Those are the questions IT executives face as they grapple with the notion of environmentally friendly computing in the midst of a crushing global economic downturn.

For many companies, going green simply means cutting data-center power expenses. By now, the basic principles of doing that are pretty well understood - consolidate servers, set up hot and cold aisles, optimize airflow, raise the ambient temperature a few degrees.

Such changes can save money, but green IT doesn't stop at the data-center door, and companies can't just pass the buck to facilities managers. IT departments can and should undertake a number of green initiatives - which won't break the bank, either.

First, persuade your company to measure its carbon footprint. This seems like an obvious place to start, but you can't address the issue in a logical, analytical manner if you don't have a starting point.

Once you have a sense of that footprint's size, you can set goals for reducing it by an achievable amount - say, 5% or 10% over a certain period of time. There are a number of actions you can take, including these (a rough footprint-estimation sketch follows the list):

-- Power down unused servers or desktops.

-- Use energy efficiency as a purchasing criterion when you replace older equipment, including network gear, servers and UPSes.

-- Adopt recycling and reuse programs.

-- Think about alternative sources of energy.

-- Encourage videoconferencing to reduce air travel.

-- Cut back on ground travel.

-- Pressure vendors to demonstrate that they have green strategies.
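
As for that first measurement step, a rough back-of-the-envelope estimate is enough to get started. The sketch below is illustrative only; the power draw, PUE and grid emissions factor are assumptions you would replace with your own measured numbers.

# Rough data-center carbon estimate (all inputs are illustrative assumptions).
avg_it_load_kw = 200.0          # average IT equipment draw, in kilowatts
pue = 1.8                       # power usage effectiveness (facility kWh / IT kWh)
grid_factor_kg_per_kwh = 0.5    # assumed grid emissions factor, kg CO2 per kWh

hours_per_year = 24 * 365
facility_kwh = avg_it_load_kw * pue * hours_per_year
co2_tonnes = facility_kwh * grid_factor_kg_per_kwh / 1000.0

print(f"Estimated annual energy use: {facility_kwh:,.0f} kWh")
print(f"Estimated footprint: {co2_tonnes:,.0f} tonnes CO2")
print(f"A 5% reduction target: {co2_tonnes * 0.05:,.0f} tonnes CO2")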

Finally, don't be fooled by "green-washing." These days, every vendor claims to be green. Be sure to verify those claims.

Network Access Control: After the Shakeout

Network access control has been a hot, fun topic for the past couple of years.

Epic standards battles pitted Cisco against Microsoft, each having its own terminology and approaches. And who could forget the Trusted Computing Group, which, with its own architecture, acted as a wild card?

Then there was the horde of third-party vendors offering to handle a company's NAC needs if it didn't want to wait for Cisco and Microsoft to deliver on their promises.

Last year was a turning point for NAC, however. The standards battles appear to have been resolved, and everything looks like it's falling into place. Customers apparently decided to wait for Microsoft to deliver its NAC products - and that left many third-party vendors out in the cold. A lot of them went under, including Caymas Systems and Lockdown Networks.

And because Network Access Protection (NAP, Microsoft's version of NAC) comes with Vista and Windows Server 2008, deciding to go with Microsoft has become a no-brainer for many customers. NAP represents a clear choice, rather than a technology that requires extensive research, RFPs, product tests and evaluations, and so forth.

NAP even proved itself in a recent product evaluation Forrester Research performed to determine which NAC tools would solve real-world deployment problems. Microsoft came in first, followed by Cisco and Juniper Networks.

This year, the questions for customers will be: Where do we deploy NAC, and how many NAC features do we turn on? Most customers today are using NAC just to control guest access. That's important, but the technology can do more. On the pre-admission side, it can scan user devices, determine whether they are clear of viruses, check whether patches are up to date and quarantine the device if security conditions aren't met. On the post-admission side, it can make sure that a clean machine remains that way, and that users access only those parts of the network to which they have authorization.
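
As a conceptual sketch only - not any vendor's actual NAC API - the pre-admission decision reduces to a simple policy check. The posture attributes and thresholds below are assumptions chosen for illustration.

from dataclasses import dataclass

@dataclass
class DevicePosture:
    """Illustrative endpoint attributes a NAC agent might report."""
    antivirus_running: bool
    virus_signatures_days_old: int
    patches_current: bool

def pre_admission_decision(posture: DevicePosture, max_signature_age_days: int = 7) -> str:
    """Return 'admit' or 'quarantine' based on a simple (assumed) policy."""
    if not posture.antivirus_running:
        return "quarantine"                              # no antivirus: remediate first
    if posture.virus_signatures_days_old > max_signature_age_days:
        return "quarantine"                              # stale virus signatures
    if not posture.patches_current:
        return "quarantine"                              # missing OS patches
    return "admit"                                       # clean machine: allow on the network

print(pre_admission_decision(DevicePosture(True, 2, True)))    # admit
print(pre_admission_decision(DevicePosture(True, 30, True)))   # quarantine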

These are important functions that every IT exec should be implementing.

10 Gigabit Ethernet: A Switch in Time

In 2001, when 10 Gigabit Ethernet switches were introduced, the average per-port cost was $39,000, according to IDC.

Today, a 10G Ethernet port costs less than $4,000, which makes 10G Ethernet switches affordable for the enterprise wiring closet or data center.

With ongoing data-center server consolidation, not to mention the needs of service providers and high-volume Web sites, standards groups and vendors are hard at work on 40 Gigabit Ethernet and even 100 Gigabit Ethernet. For now, however, 10G Ethernet is the industry standard, and customers are flocking to 10G Ethernet switches. Switch-based 10G Ethernet port shipments grew by 140% in 2007, Infonetics Research reports. Worldwide revenue for 10G Ethernet services and equipment will hit nearly $9.5 billion by year-end, a 30% increase from last year, the firm predicts.

If your Fast Ethernet boxes are becoming stressed, this might be the time to move to 10G Ethernet. Per-port prices are coming down and feature sets are going up. A recent Network World test of seven 10G Ethernet switches found these products offer not only powerful packet-pushing capabilities but also 802.1X authentication, enhanced multicast support, protection against denial-of-service attacks and IPv6 support. The test demonstrated that these switches have extensive management and security features, which are just as important as how many packets they can move per second.

Virtualization: Beyond the Server Farm

By now, you've most likely implemented some level of x86 server virtualization. So, the question of the moment is this: Does data-center virtualization on x86 boxes represent the end of your virtualization efforts or just the beginning?

What about storage virtualization? What about desktop virtualization? What about application virtualization? What about virtualizing all your data-center hardware, including Unix boxes and mainframes?

Those are the key, long-term virtualization questions facing IT executives. Once you've started down the road to decoupling the underlying technology infrastructure from the services you're providing to the business, doesn't it make sense to extend that strategy across the enterprise?

If you're inclined to agree, the next logical step would be storage virtualization, because you're dealing with another technology residing in the data center. The advantages of creating a virtual storage pool include lower-cost data migration, easier storage-resource management, common replication services and the ability to maximize and extend your storage resources.

Client virtualization, which comes in a variety of options, also offers real benefits. In the hosted virtual-desktop setup, desktops run as virtual machines on data-center servers and users access them from thin-client machines. This would be ideal, for example, in a call center.

In another version of desktop virtualization, one physical machine is virtualized. Here, separate business and personal zones could be created on mobile workers' laptops for security and compliance.

Or multiple operating systems could be run on a single PC. This scenario would apply to engineers, for example, who might be running a specific Unix or Linux-based technical application but using Windows for e-mail and other basic applications.

Most companies today are in the first stage of virtualization, says Gartner analyst George Weiss. This means they're consolidating and virtualizing servers as cost-cutting measures, typically with a single vendor.

The next phase would be using virtualization technology for the dynamic allocation of resources across servers. And the final phase, which won't occur for several more years, is heterogeneous virtualization, the ability to move workloads dynamically across hardware platforms.

Cloud Computing: Proceed With Caution

As we arrive at 2009, cloud computing is the technology creating the most buzz. Cloud technology is in its infancy, however, and enterprises would be wise to limit their efforts to small, targeted projects until the technology matures and vendors address a variety of potentially deal-breaking problems.

First off, let's define cloud computing. Gartner says it is "a style of computing whose massively scalable and elastic IT-related capabilities are provided 'as a service' to external customers using Internet technologies."

The two most commonly cited examples of cloud offerings come from Amazon.com and Google, both of which basically rent their data-center resources to outside customers.

For example, Amazon's Elastic Compute Cloud (EC2) lets customers rent virtual-machine instances and run their applications on Amazon's hardware. Other services under the EC2 umbrella include storage and databases in the cloud. Amazon uses Xen for virtualization and offers customers a choice of Linux, Solaris or Windows operating systems.

The pitch is that customers can take advantage of Amazon's expertise in running large data centers, that customers pay only for the compute and storage resources they use, and that Amazon can scale up or down easily, depending on the demand.

That's the most basic level of cloud computing - infrastructure in the cloud. In this scenario, the customer is aware of and makes choices concerning the infrastructure itself.
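
To make that concrete, here is a minimal sketch of renting a virtual-machine instance programmatically. It uses boto3, a present-day AWS SDK for Python rather than the tooling of EC2's early days, and the AMI ID and instance type are placeholders.

import boto3

# Connect to EC2; credentials are assumed to be configured in the environment.
ec2 = boto3.client("ec2", region_name="us-east-1")

# Rent a single virtual-machine instance (placeholder image ID and size).
response = ec2.run_instances(
    ImageId="ami-00000000000000000",   # placeholder: substitute a real Linux AMI
    InstanceType="t2.micro",           # placeholder instance size
    MinCount=1,
    MaxCount=1,
)
instance_id = response["Instances"][0]["InstanceId"]
print("Launched instance:", instance_id)

# Pay-per-use means shutting the instance down when you're done with it.
ec2.terminate_instances(InstanceIds=[instance_id])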

The next level is cloud computing as a Web development platform. The best example is Google's App Engine, a place where Web application developers can upload code (as long as it's written in Python) and let Google's infrastructure take care of deploying the application and allocating compute resources.
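
For flavor, here is roughly what a minimal application looked like in the App Engine Python SDK of that era; treat it as an illustrative sketch of the upload-your-code model rather than a verbatim copy of Google's documentation.

# Minimal App Engine handler in the original Python runtime (illustrative sketch).
from google.appengine.ext import webapp
from google.appengine.ext.webapp.util import run_wsgi_app

class MainPage(webapp.RequestHandler):
    def get(self):
        # App Engine deploys this code and allocates compute resources on demand.
        self.response.headers["Content-Type"] = "text/plain"
        self.response.out.write("Hello from the cloud")

application = webapp.WSGIApplication([("/", MainPage)], debug=True)

def main():
    run_wsgi_app(application)

if __name__ == "__main__":
    main()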

The third level is running enterprise applications in the cloud. A cloud vendor could host an enterprise application and take responsibility for that application's availability and performance. Gartner predicts e-mail will be one of the first enterprise applications to move to the cloud.

How is that different from software-as-a-service (SaaS)? Without getting too tangled up in semantics, SaaS typically refers to a specific vendor - Salesforce.com, for example - offering its application to multiple customers in a hosted model. Theoretically, a SaaS vendor could use the cloud infrastructure to host its applications. Also theoretically, a cloud provider could host anybody's application.

That brings us to the ultimate cloud scenario, in which these "private" clouds owned by such companies as Amazon and Google melt into one giant, public cloud that contains all the user's data and applications and is accessible anytime on any device.

That's a long way off, however. In addition, the potential roadblocks are many. They include issues of licensing, privacy, security, compliance and network monitoring. A final potential stumbling block is that enterprise applications tend to be customized and intertwined, with one system feeding into or reporting back to another. That makes it pretty tough to pluck out an application and run it in the cloud without affecting every related application.

So for now, keep an eye on the cloud, but keep your feet firmly planted on the ground.

Web 2.0: Learn to Live with It

The Web 2.0 phenomenon is unstoppable. Employees are turning in droves to blogs, wikis, mash-ups, social networking, crowdsourcing and other variations on the Web 2.0 theme. A recent Yankee Group survey found that 86% of non-IT workers are using at least one consumer Web 2.0 tool at work. As younger workers enter the enterprise workforce, access to Web 2.0 technologies will become only more of a given.

The challenge for IT executives is how best to harness Web 2.0 technologies in a way that's secure; serves such basic enterprise functions as collaboration; and adds to worker productivity, revenue generation and overall business benefits.

The possibilities are endless. A Gartner list of Web 2.0 applications includes answer marketplaces, collaborative product and service design, community-driven self-service, crowdsourcing, idea engines and prediction markets.

Many employees are using such social-networking sites as LinkedIn, MySpace or Twitter to communicate with peers and customers. In addition, a growing number of vendors aim to help companies set up and manage enterprise-grade Web 2.0 applications. For example, WorkLight offers Java-based software that will help authenticate, encrypt, store and manage Web 2.0 applications (see "10 start-ups to watch in '09"). Face Connector (formerly Faceforce) is a mash-up that brings Facebook profile and friend information seamlessly into Salesforce CRM. Socialtext 3.0 provides social networking, wikis and customizable home pages for the enterprise.

That's just the tip of the iceberg. The key is identifying an application that fits your company's culture, making it available and watching as the community takes off.
