In an upcoming special anniversary issue of FCW, we will take a closer look at five information technologies or technology trends that have had a significant impact on government in the last twenty-five years – that would be back to 1987 for the mathematically challenged among us.
Impact can be measured by the degree to which a technology transformed how government works, how government serves citizens, or how agencies acquire and manage the IT resources needed to perform their missions. In particular, we want to single out those technologies whose impact is still felt today in one or more of these areas.
Of the following list of technologies, which five do you think had the greatest impact as defined above? And do you think we missed any?
Internet – There really isn’t an argument here, is there?
E-mail – Electronic messaging is such an integral yet glamour-less part of our current work lives that it’s hard to remember a time when we actually had to talk to each other or commit ink to paper to communicate. How much of the worker productivity gains in the past twenty-five years can be tied to e-mail?
Posted on Apr 05, 2012 at 7:28 PM
The consulting firm Gartner issues its list of the top 10 strategic technologies for the coming year each October, and has for the past five years. Viewing those lists side by side presents an interesting perspective on the issues that presumably have occupied CIOs during a given time period.
Gartner defines a strategic technology as one with the potential for making a significant impact in the next three years. That impact might include a high potential for disruption to IT or business operations, the need for a major dollar investment or the risks associated with being late to adopt the technology.
How well do Gartner’s priorities match those at your agency? Green IT, which appeared for three years in a row, dropped off as an agenda item starting in 2011. Interestingly, security appears only once in five years — in the form of activity monitoring on the 2010 list. Has security really become that routine?
On other fronts, cloud computing first appeared on the 2009 list and has held a spot every year since. No surprise there. And of course, all things Web have also made perennial appearances, from “mashup and composite apps” in 2008 to “contextual and social user experience” in 2012.
Posted on Feb 21, 2012 at 7:27 PM
As federal technology executives gain experience using virtualization to reduce the number of physical servers eating up space and power in their data centers, many are discovering that it can also offer similar efficiency and cost-cutting benefits for their business continuity capabilities.
Mike Rosier, senior systems administrator at Fermi National Accelerator Laboratory, explains how he and his colleagues are using virtualization to create a more resilient IT infrastructure for the lab for a fraction of the cost of traditional business continuity options.
Federal Computer Week: Can you provide an overview of the general use of server virtualization at your organization?
Mike Rosier: At Fermilab, we've been using modern server virtualization technologies for more than five years. In fact, I'm sure we were using earlier implementations back in our mainframe days.
Some of the early reasons we decided to invest in virtualization were to address power and cooling issues in our computer rooms. This was at a time when we were trying to keep up with the growing demand for development and test systems. The procurement costs for dedicated physical servers were also eating into our yearly budgets.
Posted on Jan 20, 2012 at 7:27 PM
There’s a growing wind in the sails of microservers, a new type of data center computer that is extremely energy efficient and tailor-made for the cloud- and Internet-style workloads that are increasingly common at agencies and other big enterprises.
Dell has joined some smaller players by introducing its initial line of microservers; Intel has begun shipping the first of several processors designed specifically for use in microservers; and Facebook officials say they have big plans for the diminutive computers.
In a recent story we covered the reasons why microservers are expected to make a huge splash even though they buck one of the hottest trends in enterprise computing: the use of virtualization software to consolidate data processing chores onto fewer, more powerful servers. If you are interested in learning more about microservers, read on for some links to key news items, case studies, analysis and technical discussions.
Microservers put a low-power microprocessor, like those designed for smartphones and tablets, on a single circuit board, then pack dozens or hundreds of those cards into one server cabinet that provides centralized power supply, cooling fans and network connections.
There are many opportunities for using microservers to drastically reduce datacenter operating costs, including:
* Web applications with high volumes of small discrete transactions, like user logins, searches, checking e-mail and simple Web page views.
* Running hosting services that provide a dedicated physical machine for each application or user.
* Creating grids or clusters in which multiple server nodes work in parallel on a specific task.
* Environments that need to reduce the energy consumption and physical footprint of their data center servers.
Some folks think microservers will eventually dominate most cloud data centers. John Treadway, global director of cloud computing solutions at Unisys, makes a persuasive case for microservers on his personal blog, CloudBzz. He predicts that microservers will replace bigger servers running virtualization in most commercial cloud data centers by 2018, with internal enterprise data centers following the same path a few years later.
To see what a large-scale cloud datacenter packed to the gills with microservers looks like, click on the YouTube video available on this webpage from Data Center Knowledge. It’s taken inside French hosting company Online.net’s facility and features early microservers built by Dell. The section showing the microservers begins about 1:20 into the video.
Facebook is one of the highest-profile players in the U.S. to endorse the microserver approach for large-scale data centers. In this article from PCWorld, Gio Coglitore, director of Facebook labs, lays out the rationale for the social networking giant’s plans to move to microservers, citing reasons that include energy efficiency, avoiding virtualization vendor lock-in and increasing system resiliency.
One of the better write-ups that clearly explains the value proposition for microservers is available from one of the industry’s earliest players, SeaMicro. It’s a case study about Mozilla, the group that organizes the development of the Firefox Web browser, and its use of microservers. One of the most interesting parts of the article describes how Mozilla officials calculated, among other cost and efficiency metrics, the energy required to perform a certain processing task, in this case an Internet HTTP request. They concluded that microservers used one-fifth the power per HTTP request of a traditional server.
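As a rough sketch of that kind of calculation, energy per request falls out of dividing a server’s power draw by its request throughput. The numbers below are hypothetical placeholders for illustration, not Mozilla’s actual measurements:

```python
# Back-of-envelope energy-per-request comparison, in the spirit of the
# Mozilla/SeaMicro case study. All figures are hypothetical examples.

def joules_per_request(power_watts: float, requests_per_second: float) -> float:
    """Watts are joules per second, so dividing by requests per second
    yields joules consumed per request."""
    return power_watts / requests_per_second

# Illustrative numbers only: a traditional server drawing more power
# versus a microserver node handling a lighter, web-style workload.
traditional = joules_per_request(power_watts=350.0, requests_per_second=500.0)
micro = joules_per_request(power_watts=35.0, requests_per_second=250.0)

print(f"traditional server: {traditional:.2f} J/request")  # 0.70 J/request
print(f"microserver:        {micro:.2f} J/request")        # 0.14 J/request
print(f"ratio: {traditional / micro:.1f}x")                # 5.0x
```

With these placeholder figures the microserver comes out at one-fifth the energy per request, matching the shape of the result Mozilla reported; real measurements would of course depend on the workload and hardware.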
Correlating power consumption with the work output of an IT system is a more advanced and meaningful way to calculate data center energy efficiency than the metrics CIOs now use most often. Last year I wrote a story about efforts to increase the use of these more sophisticated metrics.
If you really want to get into the weeds about the relative performance of different processor approaches and their suitability for varying types of workloads, there are a couple of good papers that size up those debates.
One paper from a group of researchers at Carnegie Mellon University evaluates clusters of low-power chips, like those in microservers, deployed in what is called a Fast Array of Wimpy Nodes (FAWN). The FAWN approach can be a much more energy-efficient option for many types of workloads, but not all. The researchers note that in large data centers, power costs can account for up to half of a computer's three-year total cost of ownership.
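To see how power can climb toward half of a server's three-year cost, here is a minimal sketch using illustrative assumptions (hardware price, power draw, electricity rate and a facility overhead factor, or PUE, of 2.0), not figures from the paper:

```python
# Rough three-year TCO split, illustrating how power and cooling can
# approach half of a server's lifetime cost. All inputs are assumptions.

def three_year_tco(hardware_cost: float, avg_watts: float,
                   dollars_per_kwh: float = 0.10, pue: float = 2.0) -> dict:
    """PUE (power usage effectiveness) scales IT power draw to account
    for cooling and other facility overhead."""
    hours = 3 * 365 * 24  # three years of continuous operation
    energy_kwh = avg_watts / 1000.0 * hours * pue
    power_cost = energy_kwh * dollars_per_kwh
    total = hardware_cost + power_cost
    return {"hardware": hardware_cost,
            "power_and_cooling": round(power_cost, 2),
            "power_share": round(power_cost / total, 2)}

# Hypothetical example: a $3,000 server drawing 500 W on average.
print(three_year_tco(hardware_cost=3000.0, avg_watts=500.0))
```

Under these assumptions, power and cooling come to roughly $2,600 over three years, close to half the total; a lower-wattage microserver node shrinks that slice dramatically, which is the core of the FAWN argument.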
On the other hand, Google released a paper from one of its researchers that details the drawbacks of arrays of wimpy chips in certain situations. Wimpy-core systems can require software applications to be specially written to run on them, and the resulting development costs can take a big bite out of the energy savings.
Posted on May 27, 2011 at 7:27 PM
We’re working on a story for FCW about the Department of Veterans Affairs’ “Blue Button” Web application, which allows veterans to download their personal health information from the department’s MyHealtheVet site.
We’re looking for veterans who have used the VA’s Blue Button to share their opinions about the application’s usefulness with the FCW community. If you’ve given this application a try, you can use the Comment button below to tell us about your experience.
One veteran who checked out the application and wrote about the experience on a blog last fall reported being distinctly underwhelmed by the experience. “Here’s the cream that floats to the top, the icing on this cake, the best of the best; If you download and install the Blue Button to your personal computer, you will be able to securely access and download and print and share all the data that you yourself put in to the system,” wrote Jim Strickland, a veterans’ advocate.
Since then, the Centers for Medicare and Medicaid Services launched its own Blue Button application on its MyMedicare.gov website. That feature lets 47 million Medicare beneficiaries view, download and print their medical records.
Posted on Apr 08, 2011 at 7:27 PM
Finally, thanks in no small part to the recent Telework Enhancement Act, it looks like a lot more government offices will be giving telework a try. Previously resistant managers are coming on board (for the moment, anyway), identifying positions for telework eligibility, dealing with equipment needs, and developing agreements about employee performance and expectations.
Of course, most telework programs start with the best of intentions, but not all march on to meet great success. Employees who abuse the telework privilege with lackluster performance hurt productivity and can poison office morale. They can also jeopardize management support for the telework program. Sometimes managers have to work with poorly conceived telework policies, so they lack tools that could help them bring wayward employees in line.
So, tell us, what are some of the mistakes that employees or their managers can make with telework? And what can they do to avoid falling into the same traps?
Posted on Mar 11, 2011 at 7:27 PM