Using PowerShell to check password properties…

December 5th, 2013

Thanks to Mike Driest, who did most of the testing and documentation on this issue…

One of the many benefits of Coretek’s Virtual Clinical Workstation (VCW) solution is the ability to let users run their clinical applications through a “thin” client.  A thin client is a small, lightweight computer that contains very little hardware; just the minimum needed to connect to the more high-powered servers on which the applications actually run.

Some of these thin clients run a smaller, “lighter” version of Windows called Windows “Embedded”, while others don’t run Windows at all!  These devices — while being very inexpensive and convenient due to their small footprint (space-wise and energy-wise) — pose certain technical challenges in a Windows environment.  One such challenge is allowing a user to change his or her Active Directory domain password.

We had to do some troubleshooting recently in our lab to determine whether we had the correct settings to allow an Imprivata “service” account to facilitate a domain user password change from a “zero” client – a device that does not run any form of Windows.  As part of our testing, we had to ensure that the test account’s password was expired; to do this in a timely manner, we set the “pwdLastSet” attribute of the test account to ‘0’ (zero).

To confirm that the password was indeed expired, we used the following PowerShell command (requires the AD DS PowerShell Snap-In):

Get-ADUser <SamAccountName> -Properties *

In the output, “PasswordExpired” will show “True” and “PasswordLastSet” will be blank.
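
For reference, here is a minimal sketch of the full sequence (expiring the test password, then confirming it), assuming the ActiveDirectory module is available and using a hypothetical account named “testuser”:

  # Load the AD DS PowerShell module (ships with RSAT / the AD DS Snap-In)
  Import-Module ActiveDirectory

  # Force-expire the test account's password by zeroing the pwdLastSet attribute
  Set-ADUser testuser -Replace @{pwdLastSet = 0}

  # Confirm the password now shows as expired
  Get-ADUser testuser -Properties PasswordExpired, PasswordLastSet |
      Select-Object SamAccountName, PasswordExpired, PasswordLastSet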

[Before and after screenshots of the Get-ADUser output appeared here.]

 I hope you find this tip helpful!

Publishing MS Word Viewer through Citrix XenApp 6.5…

June 19th, 2013

Recently, I moved over to the Coretek virtualization team.  It’s a great opportunity to work with new technologies and implement them as part of our Virtual Clinical Workstation solution.  

Citrix is one of the technologies that plays a major part in the solution.  As part of a virtualization implementation we are working on, I was tasked with publishing the Microsoft Office viewers through Citrix XenApp 6.5.  Now, I have a pretty extensive background in software installation and configuration on Windows desktops; however, publishing them through Citrix was new to me.  

The way a published application’s files and registry keys interact with a desktop operating system is fundamentally different from that of a locally installed application.  Even so, I was surprised to find that when I attempted to open a Word document in the published Word Viewer by double-clicking the .doc file, the file itself would not open — the Word Viewer would only bring up an “Open” dialog.

 

[Screenshot: the Word Viewer “Open” dialog]

So, being fairly new to Citrix — and given that it’s a complex product made up of multiple policies, application configurations, and settings that can redirect content and drive mappings — troubleshooting issues like this can be challenging (and of course, you must consider all the applied AD policies as well).  In the end, the solution to this particular issue was pretty simple…

Fortunately, once I was able to rule out that content redirection or drive mappings might be the cause, I found a Citrix KB article that addressed the exact issue I was having.  You can read the article here: http://support.citrix.com/article/CTX128151

In short, I had to add a special parameter (“%**”) at the end of the Word Viewer’s Command Line in the Citrix AppCenter (where published applications are configured).  The default parameter contains only one asterisk (“%*”).
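
For illustration, the published application’s command line ends up looking something like the lines below; the install path shown is only an example and will vary with the environment and Word Viewer version:

  Before:  "C:\Program Files (x86)\Microsoft Office\Office12\WORDVIEW.EXE" "%*"
  After:   "C:\Program Files (x86)\Microsoft Office\Office12\WORDVIEW.EXE" "%**"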

 

[Screenshot: the published application’s command line in Citrix AppCenter]

 

Hopefully this tip will help if you experience the same issue!

 

 

 

Top 10 Storage Virtualization Trends of 2010

August 4th, 2010

The storage area network (SAN) is now an essential technology for many large and midsize enterprises. Over the years SANs have become more sophisticated as vendors have rolled out systems that deliver better storage utilization and functionality. Based on these positive developments, 2010 should bring new and interesting products in several key areas. Here are our top 10 trends to keep an eye on in the coming year — along with the insights of key IT managers who are looking to optimize their existing storage and virtualization strategies.

1. Integration of solid state with rotating media for higher performance and lower energy costs.
Product picks: EMC FAST, Fusion-io, Compellent Storage Center

In an effort to provide the best possible storage solutions, many storage vendors are looking for ways to marry the high performance of solid-state memory to the lower cost of rotating media. As prices continue to drop for all storage technologies — and as hard drives get faster and cheaper — vendors are specifically working to incorporate the latest solid-state drive technologies into traditional SAN arrays. EMC Corp. and Compellent both offer fully automated storage tiering, which is the ability to store data depending on the needs of the application. More-frequently accessed files are stored on faster-performing disks, while less-frequently needed files are moved to slower, less expensive media.

“We’re using the Compellent product as part of our new Savvis Symphony cloud infrastructure service offering,” says Bryan Doerr, CTO of St. Louis-based services provider Savvis Inc. “We like how it has a policy that sits between the application and the array to control how each block of data is written to the physical media, based on frequency of usage.”

Doerr is pleased that these decisions are made automatically. “We don’t have to map tables or keep track of what files are stored where, and that’s a very powerful benefit to us,” he says. “Compellent can move individual blocks from a low-cost and low-performing SATA drive to a solid-state drive for the most-frequently updated data.”

One of the more interesting products is a hardware accelerator plug-in adapter card from Fusion-io that can pre-cache data using solid-state memory for SAN arrays and other large-scale storage applications.

2. De-duplication technology — on storage and backups — can help reclaim unused space.
Product picks: EMC Avamar, Symantec/Veritas Netbackup PureDisk, IBM/Tivoli Storage Manager, NetApp FlexClone

De-duplication technologies can provide a powerful way to quickly reclaim storage and minimize backup jobs. When users first start applying these technologies, they’re frequently surprised at how much duplication actually exists. With PureDisk software from Symantec Corp., for example, users can drill into a backup job and see that they could save more than 95 percent of their storage by getting rid of duplicate data. This capability offers huge potential savings, particularly when backing up virtual machine (VM) collections and remote offices.

Part of the challenge when using VMs is dealing with the fact that they share many common files inside each virtual image — the boot files for the operating system, the applications and so forth. A de-duplication product can leverage this by making only a single copy of common files.
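
As a toy illustration of the principle only (not how PureDisk itself works), the following PowerShell snippet finds duplicate files by content hash; block-level de-duplication applies the same idea at a much finer granularity:

  # Group files by SHA-256 hash; any group with more than one member is duplicated data
  Get-ChildItem -Path C:\VMs -Recurse -File |
      Get-FileHash -Algorithm SHA256 |
      Group-Object -Property Hash |
      Where-Object { $_.Count -gt 1 } |
      ForEach-Object { $_.Group.Path }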

PureDisk is typical of de-duplication products in that it operates in two different ways. For starters, you can use a PureDisk client or agent that runs on each VM and reports the unique files back to the central PureDisk backup server. And PureDisk can also back up the entire VMware VMDK image file without any agents on the separate VMs. This offloads backup from the ESX server and enables single-pass backups to protect all the files — whether they’re in use or not — that comprise the VM.

“De-duplication gives us big storage savings,” says Chuck Ballard, network and technical services manager at food manufacturer J&B Group, based in St. Michael, Minn. “We have 30 machines, each with a 20GB virtual hard drive, on our SAN. Rather than occupy 600GB, we have about a third of that, and we can grow and shrink our volumes as our needs dictate. We use the [NetApp] LUN copy utility to replicate our workstation copies off of a master image.”

Ballard stores his images on NetApp’s SAN arrays that have their own utility — called FlexClone — to make virtual copies of the data. “We had EMC and also looked at IBM, but both of them had limited dynamic-provisioning features,” he says, adding that a VMware upgrade that required 4.5TB on J&B Group’s old SAN now uses just 1.5TB on the company’s new storage infrastructure.

3. More granularity in backup and restoration of virtual servers.
Product picks: Vizioncore vRanger Pro, Symantec Netbackup, Asigra Cloud Backup

When combined with de-duplication technologies, more granular backups make for efficient data protection — particularly in virtualized environments, where storage requirements quickly balloon and backups can take longer than an overnight window. Backup vendors are getting better at enabling recoveries that understand the data structure of VM images and can extract just the necessary files without having to restore an entire VM disk image. Symantec Netbackup and Vizioncore vRanger both have this feature, which makes them handy products to have in the case of accidentally deleted configuration or user files. For its part, Asigra Cloud Backup can protect server resources both inside the data center and in the cloud.

4. Live migrations and better integration of VM snapshots make it easier to back up, copy and patch VMs.
Product picks: FalconStor FDS, VMware vMotion and vStorage APIs, Citrix XenServer

VMware vStorage API for Data Protection facilitates LAN-free backup of VMs from a central proxy server rather than directly from an ESX Server. Users can do centralized backups without the overhead and hassle of having to run separate backup tasks from inside each VM. These APIs were formerly known as the VMware Consolidated Backup, and the idea behind them is to offload the ESX server from the backup process. This involves taking VM snapshots at any point in time to facilitate the backup and recovery process, so an entire .VMDK image doesn’t have to be backed up from scratch. It also shortens recovery time.

Enhanced VM storage management also includes the ability to perform live VM migrations without having to shut down the underlying OS. Citrix Systems XenServer offers this feature in version 5.5, and VMware has several tools including vMotion and vSphere that can make it easier to add additional RAM and disk storage to a running VM.

Finally, vendors are getting wise to the fact that many IT engineers are carrying smartphones and developing specific software to help them manage their virtualization products. VMware has responded to this trend with vCenter Mobile Access, which allows users to start, stop, copy and manage their VMs from their BlackBerry devices. Citrix also has its Receiver for iPhone client, which makes it possible to remotely control a desktop from an iPhone and run any Windows apps on XenApp 5- or Presentation Server 4.5-hosted servers. While looking at a Windows desktop from the tiny iPhone and BlackBerry screens can be frustrating — and a real scrolling workout — it can also be helpful in emergency situations when you can’t get to a full desktop and need to fix something quickly on the fly.

5. Thin and dynamic provisioning of storage to help moderate storage growth.
Product picks: Symantec/Veritas Storage Foundation Manager, Compellent Dynamic Capacity, Citrix XenServer Essentials, 3Par Inserv

There are probably more than a dozen different products in this segment that are getting better at detecting and managing storage needs. A lot of space can be wasted setting up new VMs on SAN arrays, and these products can reduce that waste substantially. This happens because, when provisioning SANs, users generally don’t know exactly how much storage they’ll need, so they tend to err on the high side by creating volumes that are large enough to meet their needs for the life of the server. The same thing happens when they create individual VMs on each virtual disk partition.

With dynamic-provisioning applications, as application needs grow, SANs automatically extend the volume until it reaches the configured maximum size. This allows users to over-provision disk space, which is fine if their storage needs grow slowly. However, because VMs can consume a lot of space in a short period of time, this can also lead to problems. Savvy users will deal with this situation by monitoring their storage requirements with Storage Resource Management tools and staying on top of what has been provisioned and used.
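
The SAN products named here do this at the array level, but the pattern may be familiar from hypervisor tooling. Purely as an analogy, a dynamically expanding virtual disk in Hyper-V starts small on physical storage and grows toward its configured maximum as data is written:

  # Create a 500GB dynamically expanding (thin-provisioned) virtual disk;
  # the .vhdx file consumes physical space only as data is actually written
  New-VHD -Path D:\VMs\thin-disk.vhdx -SizeBytes 500GB -Dynamic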

Savvis is using the 3Par InServ Storage Servers for thin provisioning. “We don’t have to worry about mapping individual logical units to specific physical drives — we just put the physical drives in the array and 3Par will carve them up into usable chunks of storage. This gives us much higher storage densities and less wasted space,” says Doerr.

Citrix XenServer Essentials includes both thin- and dynamic-provisioning capabilities, encoding differentials between the virtual disk images so that multiple VMs consume a fraction of the space required because the same files aren’t duplicated. Dynamic workload streaming can be used to rapidly deploy server workloads to the most appropriate server resources — physical or virtual — at any time during the week, month, quarter or year. This is particularly useful for applications that may be regularly migrated between testing and production environments or for systems that require physical deployments for peak user activity during the business cycle.

Compellent has another unique feature: the ability to reclaim unused space. Its software searches for storage blocks that are part of deleted files and marks them as unused so that Windows OSes can overwrite them.

6. Greater VM densities per host will improve storage performance and management.
Product pick: Cisco Unified Computing System

As corporations make use of virtualization, they find that it can have many applications in a variety of areas. And nothing — other than video — stretches storage faster than duplicating a VM image or setting up a bunch of virtual desktops. With these greater VM densities comes a challenge to keep up with the RAM requirements needed to support them.

In this environment, we’re beginning to see new classes of servers that can handle hundreds of gigabytes of RAM. For example, the Cisco Systems Unified Computing System (UCS) supports large amounts of memory and VM density: In one demonstration from VirtualStorm last fall at VMworld, there were more than 400 VMs running Windows XP on each of six blades on one Cisco UCS. Each XP instance had more than 90GB of applications contained in its Virtual Desktop Infrastructure image, which was very impressive.

“It required a perfect balance between the desktops, the infrastructure, the virtualization and the management of the desktops and their applications in order to scale to thousands of desktops in a single environment,” says Erik Westhovens, one of the engineers from VirtualStorm, writing in a blog entry about the demonstration.

Savvis is an early UCS customer. “I like where Cisco is taking this platform; combining more functionality within the data center inside the box itself,” Doerr says. “Having the switching and management under the hood, along with native virtualization support, helps us to save money and offer different classes of service to our Symphony cloud customers and ultimately a better cloud-computing experience.”

“If you don’t buy enough RAM for your servers, it doesn’t pay to have the higher-priced VMware licenses,” says an IT manager for a major New York City-based law firm that uses EMC SANs. “We now have five VMware boxes running 40 VMs a piece, and bought new servers specifically to handle this.”

As users run more guest VMs on a single physical server, they’ll find they need to have more RAM installed on the server to maintain performance. This may mean they need to move to a more expensive, multiple-CPU server to handle the larger RAM requirements. Cisco has recognized that many IT shops are over-buying multiple-CPU servers just so they can get enough dual in-line memory module slots to install more RAM. The Cisco UCS hardware will handle 384GB of RAM and not require the purchase of multiple processor licenses for VMware hypervisors, which saves money in the long run.

James Sokol, the CTO for a benefits consultancy in New York City, points out that good hypervisor planning means balancing the number of guest VMs with the expanded RAM required to best provision each guest VM. “You want to run as many guests per host [as possible] to control the number of host licenses you need to purchase and maintain,” Sokol says. “We utilize servers with dual quad-core CPUs and 32GB of RAM to meet our hosted-server requirements.”

A good rule of thumb for Windows guest VMs is to use a gigabyte of RAM for every guest VM that you run.
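Applied to the law firm example above, 40 guest VMs per host works out to roughly 40GB of RAM for the guests alone, before allowing any headroom for the hypervisor itself.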

7. Better high-availability integration and more fault-tolerant operations.
Product picks: VMware vSphere 4 and Citrix XenServer 5.5

The latest hypervisors from VMware and Citrix include features that expedite failover to a backup server and enable fault-tolerant operations. This makes it easier for VMs to be kept in sync when they’re running on different physical hosts, and enhances the ability to move the data stored on one host to another without impacting production applications or user computing. The goal is to provide mainframe-class reliability and operations to virtual resources.

One area where virtualized resources are still playing catch-up to the mainframe computing world is security policies and access controls. Citrix still lacks role-based access controls, and VMware has only recently added this to its vSphere line. This means that in many shops, just about any user can start and stop a VM instance without facing difficult authentication hurdles. There are third-party security tools — such as the HyTrust Appliance for VMware — that allow more granularity over which users have what kind of access to particular VMs. Expect other third-party virtualization management vendors to enter this market in the coming year. (To get an idea of how HyTrust’s software operates, check out the screencast I prepared for them here.)

8. Private cloud creation and virtualized networks — including vendor solutions that offer ways to virtualize your data center entirely in the cloud.
Product picks: Amazon Virtual Private Cloud, VMware vSphere vShield Zones, ReliaCloud, Hexagrid VxDataCenter

Vendors are virtualizing more and more pieces of the data center and using virtual network switches — what VMware calls vShield Zones — to ensure that your network traffic never leaves the virtualized world but still retains nearly the same level of security found in your physical network. For example, you can set up firewalls that stay with the VMs as they migrate between hypervisors, create security policies and set up virtual LANs. Think of it as setting up a security perimeter around your virtual data center.

Amazon has been hard at work on its Elastic Compute Cloud — its cloud-based, virtualized computing service — and last summer added Virtual Private Cloud to its offerings. This enables users to extend their VPNs to include the Amazon cloud, further mixing the physical and virtual network infrastructures. It’s also possible to extend any security device on your physical network to cover the Amazon cloud-based servers. The same is true with Amazon Web Services, where customers pay on a usage-only basis with no long-term contracts or commitments.

Microsoft has a series of new projects to extend its Windows Azure cloud-based computing to private clouds. They include ventures such as “Project Sydney,” which enables customers to securely link their on-premises and cloud servers; AppFabric, which is a collection of existing Windows Azure developer components; and updates to Visual Studio 2010.

Some of these are, or soon will be, available in beta. But like other efforts, more federated security between the cloud and in-house servers will require improvements before these new offerings can be dependably used by most enterprises.

Two new entrants to the cloud computing services arena are Hexagrid Inc. and ReliaCloud, both of which offer a wide range of infrastructure services, including high availability, hardware firewalls and load balancing. With these companies, all cloud servers are assigned private IP addresses and have persistence, meaning that users treat them as real servers even though they’re residing in the cloud. Expect more vendors to offer these and other features that allow IT managers to combine physical and cloud resources.

9. Better application awareness of cloud-based services.
Product picks: Exchange 2010, Sparxent MailShadow

It isn’t just about networks in the cloud, but actual applications too, such as Microsoft Exchange services. The days are coming when you’ll be able to run an Exchange server in a remote data center and fail over without anyone noticing. Part of this has to do with improvements Microsoft is making to the upcoming 2010 release of its popular e-mail server software. It also has to do with how virtualization and third-party vendors are incorporating and integrating disaster recovery into their software offerings. An example of the latter is MailShadow from Sparxent Inc. This cloud-based service makes a “shadow” copy of each user’s Exchange mailbox that’s kept in constant synchronization. There are numerous cloud-based Exchange hosting providers that have offered their services over the past few years, and Microsoft is working on its own cloud-based solutions as well.

10. Start learning the high-end, metric system measurements of storage.
If you thought you knew the difference between gigabytes and terabytes, start boning up on the higher end of the metric scale. SAN management vendor DataCore Software Corp. now supports arrays that can contain up to a petabyte — a thousand terabytes — of data. Savvis sells 50GB increments of its SAN utility storage to its co-location customers, which Doerr says has been very well received. “It’s for customers that don’t want to run their own SANs or just want to run the compute-selected functions,” he states. “There’s a lot of variation across our customers. You have to be flexible if you want to win their business.” Given that it wasn’t too long ago when no one could purchase a 50GB hard drive, he says this shows that, “we’re going to be talking exabytes when it comes to describing our storage needs before too long.” Next up: zettabytes and yottabytes.

Source: Redmondmag.com

Virtual Servers, Real Growth

July 12th, 2010

 

If you follow tech industry trends, you’ve probably heard of cloud computing, an increasingly popular approach of delivering technology resources over the Internet rather than from on-site computer systems.

Chances are, you’re less familiar with virtualization — the obscure software that makes it all possible.

The concept is simple: rather than having computers run a single business application — and sit idle most of the time — virtualization software divides a system into several “virtual” machines, all running software in parallel.

The technology not only squeezes more work out of each computer, but makes large systems much more flexible, letting data-center techies easily deploy computing horsepower where it’s needed at a moment’s notice.

The approach cuts costs, reducing the amount of hardware, space and energy needed to power up large data centers. Maintaining these flexible systems is easier, too, because managing software and hardware centrally requires less tech support.

The benefits of virtualization have made cloud computing an economical alternative to traditional data centers.

“Without virtualization, there is no cloud,” said Charles King, principal analyst of Pund-IT.

That’s transforming the technology industry and boosting the fortunes of virtualization pioneers such as VMware (NYSE:VMW) and Citrix Systems (NMS:CTXS), two of the best-performing stocks in IBD’s specialty enterprise software group. As of Friday, the group ranked No. 24 among IBD’s 197 Industry Groups, up from No. 121 three months ago.

1. Business

Specialty enterprise software represents a small but fast-growing segment of the overall enterprise software market, which according to market research firm Gartner is set to hit $229 billion this year.

As with most software, the segment is a high-margin business. With high upfront development costs but negligible manufacturing and distribution expenses, specialty software companies strive for mass-market appeal. Once developers recoup their initial development costs, additional sales represent pure profit.

Software developers also make money helping customers install and run their software, another high-margin business.

But competition is fierce. Unlike capital-intensive businesses, software companies require no factory, heavy equipment, storefront or inventory to launch. Low barriers to entry mean a constant stream of new competitors looking to out-innovate incumbents.

In addition to the virtualization firms, notable names in the group include CA Technologies (NMS:CA) and Compuware (NMS:CPWR).

All offer infrastructure software to manage data centers.

“Big-iron” mainframe computers began using virtualization in the 1970s, around the time when CA and Compuware were founded.

In the late 1990s, VMware brought the technology to low-cost systems running ordinary Intel (NMS:INTC) chips. VMware has since emerged as the dominant player in virtualization.

Citrix has added a twist to the concept, virtualizing desktop computers. Rather than installing workers’ operating system and applications on hundreds of PCs spread across the globe, companies can use the technology to run PCs from a bank of central servers. Workers, who access their virtual PCs over the Internet, don’t know the difference.

Microsoft (NMS:MSFT) has jumped in with its own virtualization product, Hyper-V, which it bundles free into Windows Server software packages. Oracle (NMS:ORCL) and Red Hat (NYSE:RHT) have launched virtualization products as well.

Meanwhile, CA and Compuware are racing to move beyond their mainframe roots to support virtualization and cloud-computing-enabled data centers. In February, CA said it would buy 3Tera to build services and deploy applications aimed at the cloud-computing market.

And Compuware bought privately held Gomez, Inc. last fall to manage cloud application performance.

Name Of The Game: Innovate. With a fast-moving market and steady influx of new competitors, keeping customers happy with good service and money-saving breakthroughs is vital.

2. Market

Nearly everyone who runs a corporate computer system is a potential buyer of virtualization software. Companies ramping up their information-technology purchases use the software to manage their sprawling infrastructure; others with limited budgets use it to squeeze more out of their existing systems.

Sales of server-virtualization software are set to grow 14% this year to $1.28 billion, according to a report by Lazard Capital Markets. Sales of software to manage virtual environments will grow 44% in 2010 to $1.88 billion.

Desktop virtualization revenue will rise 184% this year to $847.8 million. Citrix has the edge in this budding market with its XenDesktop product.

VMware is dominant among large enterprises, controlling about 85% of the server virtualization market. Microsoft is favored by small and midsize companies.

Virtualization is seen as “a strategic asset” for enabling cloud computing, and continues to gain momentum, says Lazard analyst Joel Fishbein.

VMware has the early-mover advantage in this market with its vSphere platform and has stayed ahead by adding new features such as data security and disaster recovery, analysts say.

But Citrix is partnering closely with Microsoft to take on VMware in virtualization.

3. Climate

Competition is heating up as companies scramble to adopt virtualization. Before 2009, just 30% of companies used virtualization, says analyst Fishbein. This year, that will double to 60%. Most of the gain is coming from small and midsize customers.

In addition, virtual servers are soon expected to more than double as a percentage of the overall server workload, from 18% today to 48% by 2012.

VMware says it can stay a step ahead of the pack by building new features into its products, says Dan Chu, VMware’s vice president of cloud infrastructure and services.

“We have a large technology lead with what we enable for our customers,” Chu said. “We are several years ahead of what the others are doing.”

Citrix CEO Mark Templeton says his firm’s broadening strategy — offering a variety of products with multiple licensing options and distribution channels — will grow sales.

“What’s going on is a massive shift in how computing gets delivered,” Templeton said. “In an environment that’s changing so dramatically, the highest-risk thing you can do is not act.”

4. Technology

The first virtualization boom stemmed from a shift over the last decade away from big expensive mainframes and minicomputers to massive banks of cheap Intel-powered machines. Virtualization gave these low-cost systems some of the high-end features of their pricier counterparts.

Virtualization software makers are betting on a second wave of growth fueled by the industrywide shift to cloud computing.

Technology managers use virtualization to run cloud computing in their own data centers. And large tech vendors such as Microsoft use the technology for cloud-computing services they sell to customers.

Dividing computers into isolated virtual machines gives cloud service providers the benefits of shared computing resources without the security downsides.

VMware has the early lead in virtualization. But the technology is quickly becoming a commodity as Microsoft and others bundle it into their broader platforms.

“VMware is known as a virtualization company, and Microsoft is a platform company,” said David Greschler, who heads up Microsoft’s virtualization efforts. “Their strategy is to sell virtualization, but our strategy is to make virtualization available as part of a larger platform at no extra cost.”

At the same time, a shift toward a world of cloud-computing services hosted by the likes of Microsoft, Amazon.com (NMS:AMZN) and Google (NMS:GOOG) could lead to fewer companies purchasing virtualization software themselves.

Source: Investor’s Business Daily

Cloud Computing with Wyse

June 8th, 2010

Cloud Computing involves using information technology as a service over the network.

  • Services with an API accessible over the Internet
  • Using compute and storage resources as a service
  • Built on the notion of efficiency above all
  • Using your own datacenter servers, or renting someone else’s in granular increments, or a combination

We at Wyse believe cloud computing has the potential to change how we invent, develop, deploy, scale, update, maintain, and pay for applications and the infrastructure on which they run.

Cloud Computing Benefits

Efficiency

Drives cost out of the delivery of services, eliminating capital expense in favor of more easily managed operating expense

Agility

Increases speed and agility in deploying services, adapting to seasonal or cyclical computing needs

Speed

Shortens implementation cycle time

Flexibility

With application deployment decoupled from server deployment, applications can be deployed and scaled rapidly, without having to procure physical servers

Ubiquity

Applications can be made available anywhere, any time

Cost avoidance

Minimizes the risk of deploying physical infrastructure and lowers the cost of entry, while thin devices enable green IT

Accelerated innovation

Reduces run time and response time, increasing the pace of innovation

Users

Task Workers

Call centers addressing the trend to move employees off campus and into the home
Challenge: PCs are virus-prone, difficult to manage, and at risk of data being exposed or devices being stolen. Have tried terminal services and been dissatisfied with the user experience
Require flexible seating, reduction in energy use, and simplified management and deployment

Financial Services

Branch systems with varying classes of users across broad geographies, attracted to client virtualization for security and compliance.
Challenges: VDI pilots proving difficult to scale, user experience across the branch system is inconsistent, and higher levels of user functionality are required for banking peripherals.

Healthcare

Hospitals addressing the need for EMR and access to common databases and information
Challenge: Healthcare environments require easy, secure access to up-to-date EMRs, including access for field professionals; risk of theft of traditional PC devices and the liability of data loss and exposure; universal access needed throughout the primary and satellite facilities

Education

Schools need to provide common IT functionality to faculty, staff and students
Challenge: shrinking budgets, limited IT administrative support, aging computing devices, and providing lesson and status access to parents, teachers and students from home

Engineering

Technology companies needing to provide secure access devices to onshore, offshore, and outsourced engineers and developers.
Challenge: Requirements to deliver productivity applications while addressing the performance demands of engineers and developers

Simple EndPoint Benefits

Security/Privacy

No HDD prevents data from being stored on the client, improving data security. All data stays on the server / cloud, enhancing privacy enforcement. Devices can be virus proof, removing a security concern for the endpoint component of the environment.

Compliance

HIPAA in healthcare, Basel II in banking, and Sarbanes-Oxley regulations all require data to be protected and centralized. Thin endpoints enforce this requirement, easing compliance.

Manageability

Lowest TCO is accomplished when all endpoints appear similarly to the server. TCs, TPCs, and Thin mobile devices (handheld and notebook size) are all centrally managed, and look the same to servers. Like the Southwest Airlines strategy – all 737-300s – easier to manage.

Reliability

Thin clients are far simpler in design than traditional PCs, and deliver far greater reliability. Measured in terms of MTBF (mean time between failures), PCs offer 30–40K hours, but TCs deliver far better ratings at 80–375K hours.
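To put those figures in perspective, 40,000 hours is roughly four and a half years of continuous operation, while 375,000 hours is more than 40 years.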

Rapid Deployment

No imaging requirements on most thin devices make TCs deployable in minutes, not hours as with most PCs. TCs go from carton to desktop to productivity in minutes. No software to load, little configuration needed.

Power, Noise, Cooling

TCs use a fraction (less than 10 percent) of the energy needed by traditional PCs. No fans or moving parts in TCs eliminate noise, and reduce AC requirements in the work areas.

E-Medical Records: 10 Steps To Take Now

March 11th, 2010

Don’t wait for the government to finalize meaningful use requirements. Here’s how to jump-start your health IT efforts.

The federal government’s $20 billion-plus healthcare IT stimulus program has more hospitals and doctors than ever planning to implement e-medical record and other health IT systems. But many healthcare providers have put plans on hold as they wait for the government’s final “meaningful use” rules that will determine which types of systems are eligible for reimbursements.

“I’ve been in this industry for 25 years, and I’ve never seen as much anxiety and confusion,” said Dr. Mark Leavitt, chairman of the Certification Commission for Health IT. Leavitt spoke with InformationWeek at the Healthcare Information Management Systems Society (HIMSS) conference in Atlanta Tuesday.

Despite all the uncertainty, there are steps providers can take now that will help them jump-start system deployments once the final rules are issued later this spring. Here are 10 top ones:

1) Get buy-in and sponsorship from your organization’s top leadership, including influential clinicians and the CEO. “Solicit your leadership team and actively communicate with upper management,” said Curt Kwak, CIO of the western region of Providence Health & Services, a provider that serves Washington, Oregon, Montana, California, and Alaska.

Support from the top is critical, especially when convincing users to give up old work habits and processes. Make sure everyone understands your goals, such as how the new systems will improve quality of care.

2) Decide how you’ll fund the project–remember stimulus dollars don’t start flowing until 2011. Some EMR vendors are offering interest-free loans for the upfront costs related to the purchase of these systems. Also consider applying for federal, state, and private grants. And some hospitals are offering free EMR software to doctors under the relaxed federal Stark rules.

3) Start evaluating your workflow and processes. Figure out which of your current steps waste time and money and can be eliminated with the new system. “Health IT is truly a magnifying glass, you’ll see all your flaws,” said Florence Chang, senior VP and CIO at MultiCare, a Tacoma, Wash., hospital network. “Decide what steps don’t add value.”

4) Find out where key information resides in your organization. For instance, is information on patients’ allergies in paper charts or computerized files? Start collecting information on how many prescription drug orders your doctors put through, and how they do those orders–paper, fax, or phone-in. You’ll need this data later to measure your organization’s meaningful use of electronic ordering, said Mike Wilson, senior IT director of clinical systems at Compuware.

5) Look at EMR and other health IT products for the ones that fit your organization’s needs. Consider products that have a good shot at attaining meaningful use certification, like those already approved by the Certification Commission for Health IT, or software from vendors that are offering meaningful use compliance guarantees.

6) If you’re not ready for a big bang approach to EMRs, consider modular software and components that let you add functionality in increments. “Look at the entire puzzle for what pieces fit now, and what can fit later,” Providence Health & Services’ CIO Kwak said.

7) Determine whether you have the resources and staff to handle an on-site system–both to implement it and keep it running. If not, then maybe a hosted model makes more sense. If you need to recruit talent, figure out the skills you’ll need and get going.

8) Get your infrastructure ready to deal with new systems. For instance, can it handle computerized physician order entry? If not, figure out what foundation you can start laying now, said Avery Cloud, VP and CIO of New Hanover Health Network, a health care organization in Wilmington, N.C.

9) If you were already planning or implementing health IT systems prior to the HITECH legislation passing in February 2009, don’t change things now. Don’t divert from your original plans just because meaningful use deadlines are compressing the timeframe, said Kwak.

10) Finally, don’t jump into poorly thought out health IT plans just to try getting the stimulus rewards. “Don’t do it just for the money,” said Wilson. “It’s like having a baby just for the tax break.”

Source: By Marianne Kolbasuk McGee,  InformationWeek
March 3, 2010
URL: http://www.informationweek.com/story/showArticle.jhtml?articleID=223101301

Endpoint Virtualization for Healthcare Providers

January 15th, 2010

It’s one of the more vexing challenges in healthcare.

Every day, doctors, nurses, case managers, and other hospital workers need quick and reliable access to key applications. And because they’re continually on the move, they need to be able to go to any workstation or kiosk to call up a particular application. But all too often they can’t get access because of problems inherent in the delivery of specific and proprietary healthcare applications and the complexities of managing the client system environment.

What if applications, and even the entire desktop, were able to follow these roaming users and be accessed from virtually any device? What if there was a much easier way for users to work in an increasingly digital environment, where Computerized Physician Order Entry (CPOE) and Electronic Medical Records (EMR) are becoming commonplace?

This article looks at how centralized data management and endpoint virtualization can help physicians and clinicians, as well as IT staff, work more productively and securely.

The frustration factor

The access challenges that physicians and clinicians routinely face today can be daunting, to say the least.

  • Password problems: It’s easy to forget which passwords to use for which applications, and when to reset passwords. Calling the helpdesk for assistance can take up valuable time.
  • Application access and printing confusion: When using another workstation, or returning to the kiosk they were using earlier in the day, doctors and other users have to find the right application and navigate back to the place where they left off. This can be frustrating and time-consuming, particularly if the user moved to a different workstation that has a different user interface. Printing can also turn into a hassle for roaming users. They may not know which printer is used by a particular workstation. Or a printer may not be located nearby.
  • Remote access issues: When working remotely, users may not be able to reliably connect to the network and access the applications they need. And when they connect, the desktop may be different from what it is at the hospital. Even more frustrating, remote connections are often unreliable, dropping users in mid-session.
  • Inability to use computing resources: Some guest users, such as candy stripers and vendors, can’t use computing resources for basic functions because they aren’t authorized for the corporate network.

The IT challenges

Now let’s look at access from the point of view of the IT department. Since hospitals never close, IT has to ensure continual, reliable access every hour of every day. And there’s no shortage of challenges in making that happen:

  • Desktop management: Clinicians often share workstations in a kiosk-like fashion, and it’s not unusual for a single workstation to be used by dozens of people in a single day. Many times, hospital workers also need to access applications and patient data from different client devices. To enable device-to-device roaming and kiosk capabilities, IT must apply the highest-common denominator to every workstation. This means setting up and maintaining each workstation with all of the applications users might need, and making sure each workstation has the computing power to handle all of these applications. That’s not an efficient use of resources.
  • An inundated help desk: When a user doesn’t know how to find the local printer, it means another call to the help desk. And when users can’t remember their passwords or haven’t reset them, the help desk has to walk them through the process. To save time, people end up using other workers’ passwords instead of contacting the helpdesk. Shared passwords not only violate HIPAA mandates, they also hinder identity management initiatives.
  • Remote access issues: Enabling remote access is a must for most healthcare facilities, but addressing VPN connectivity issues can become a time-consuming chore for the IT staff.

The promise of centralized management

Symantec believes that many of these challenges can be addressed by taking a centralized approach to the management of data, which makes information more easily accessible to both healthcare providers and IT personnel. With a centralized management approach, care providers in different geographical locations can access the same applications and information simultaneously no matter where they are. This increases efficiency and productivity, while enabling providers to respond more quickly and improve quality of care.

By employing centralized management, hospitals can reduce IT costs and response times while increasing user satisfaction and security. Password management is easier, and password security is actually increased, for example, as is reporting and auditing for regulatory compliance issues. Access to patient information does not rely on the availability of a single workstation. When a particular endpoint becomes unavailable, the information remains accessible elsewhere.

Centralization also strengthens data security procedures for healthcare providers and networks. Hospitals typically use an open architecture in which users who are not employed by the hospital are constantly entering and leaving the environment. Although each endpoint may have security measures installed, the responsibility for updating and maintaining those measures today lies with the owner of the endpoint. Central management of data and applications strengthens this model by ensuring protection regardless of any security measures implemented on endpoints.

The promise of endpoint virtualization

There is an additional technology solution that can streamline the way healthcare organizations provide access to key applications: endpoint virtualization. While many organizations are already familiar with server virtualization, endpoint virtualization may be a new concept for them.

Endpoint virtualization in this context refers to the ability to provide a portable computing experience across a broad range of computing environments. The promise of endpoint virtualization lies in improving the end-user experience while helping to lower the cost of managing endpoint devices.

For clinicians, endpoint virtualization offers access to the user’s personalized workspace (desktop and applications) from any device (networked or remote) via a single authentication method. If physicians are able to authenticate to a network with a single sign-on, rather than authenticating from each endpoint they use throughout the day, they can access applications from any networked or remote device.

Endpoint virtualization supports a clinical work environment by allowing the shared use of devices through rapid desktop switching and the ability to roam from one device to another while maintaining the active state of the desktop. Users can print locally even when roaming, eliminating the hassle of tracking down printers. And for physicians working remotely, their personalized workspace looks and acts exactly as it does when they’re in the hospital.

For IT professionals, endpoint virtualization enables IT to centrally manage all users, workstations, and applications, simplifying IT efforts to provision applications and updates to users. By intelligently allocating computing resources based on user class profiles, IT can optimize these resources. IT staff no longer need to apply the “highest common denominator” to every workstation.

In addition, by centralizing control of password management and enabling single sign-on, IT can more quickly and easily resolve password issues when they arise.

The bottom line: IT can reduce costs and response times while maintaining a high level of user satisfaction and security.

Symantec and endpoint virtualization

Thanks to its extensive portfolio for managing virtual workspaces and providing a portable computing experience, Symantec can help organizations better secure and manage their endpoint data and applications. Symantec’s strategy is to help enable a truly dynamic endpoint, where applications and information are delivered to any computing environment in a seamless manner.

As hospitals continue to automate and add applications, providing convenient access to these applications for physicians and clinicians while maintaining security and patient data privacy will prove to be a challenge. But increasingly, hospitals will discover that centralized data management and endpoint virtualization can help address these issues.

Key Considerations for Hospitals Making the Move to EHRs

October 27th, 2009

Will the economic stimulus package signed into law by President Obama in February accelerate the broad adoption of electronic health records? The expected impact of the bill has been widely discussed since it was enacted.
 
According to a study that appeared last year in the New England Journal of Medicine, only 17% of office-based physicians are using some sort of EHR. Hospitals have also been slow to go electronic. Another study appearing in the NEJM found that just 1.5% of non-federal hospitals have a comprehensive EHR system across all clinical units.
 
So while many hospitals and physicians have taken initial steps with automation, they have yet to adopt comprehensive systems. High costs, the difficulty of changing the clinical culture from a paper-based workflow, and the current economic downturn (resulting in reduced budgets, layoffs, a drop in patients, and difficulties in getting credit) have all impeded caregivers’ ability to invest in new systems.
 
But the reluctance to embrace EHRs could dissolve soon as a result of the stimulus package and healthcare reform, observers say.
 
The $787 billion package, officially known as the American Recovery and Reinvestment Act (ARRA), sets aside $17 billion in direct incentive payments to physicians and hospitals that adopt EHRs, plus significant indirect funds to enable adoption and remove technology barriers. Some analysts put the complete number as high as $36 billion. Efforts to reform healthcare have focused on improving the quality of patient care and reducing costs through information technology.
 
This article looks at some of the special challenges hospitals face as they make the move to electronic health records.
 
The bottom line: Do more with less
Under the new law, a hospital that adopts certified EHR technology will be rewarded with increased Medicare or Medicaid reimbursements. Incentive payments will go to hospitals that demonstrate “meaningful use” of certified EHR technology during each incentive payment year starting in 2011. To encourage hospitals to adopt EHR technology early, the total possible amount of incentive payments will decrease the longer a hospital waits to become an EHR user, and eventually the incentives will turn into “penalties” by way of decreasing Medicare reimbursement.
 
To qualify as a meaningful user of EHR technology, a hospital must demonstrate that its EHR technology enables it to prescribe electronically, exchange data with other providers, and generate reports on how it performs on certain “clinical quality measures.” Also, the EHR system in use must be certified. (These are the proposed criteria, which have yet to be finalized.)
 
The federal government estimates that the conversion to digital records will save $12 billion in healthcare spending over 10 years, which presumably would result in lower Medicare and Medicaid outlays, as well as positive impacts for employers and citizens alike.
 
But a new study finds that the current economic conditions are making it difficult for hospitals to adopt “meaningful” EHR systems.
 
According to the PricewaterhouseCoopers Health Research Institute report, “Rock and a Hard Place: An Analysis of the $36 Billion Impact From Health IT Stimulus Funding,” healthcare providers are struggling to find money to implement EHR systems because the economic recession has depleted capital resources and forced them to make cuts in their IT budgets.
 
A separate PricewaterhouseCoopers survey of 100 hospital CIOs found that 82% of hospitals already have cut their IT budgets by an average of 10%, while 10% have cut their budgets by more than 30%. Two-thirds of the CIOs surveyed said they anticipate making additional cuts in IT spending by the end of this year.
 
In such an environment, it is clearly difficult to commit to large projects like the implementation of an EHR. For many hospitals, it’s a matter of having to do more with less. Hospitals need to be efficient in their IT systems to reduce costs so that they can invest in clinical automation and EHRs, and support them cost-effectively.
 
More stringent privacy and security requirements
Under the health IT provisions of the stimulus package, all entities that handle protected health information must comply with HIPAA (Health Insurance Portability and Accountability Act) security and privacy regulations. Under the new Breach Notification for Unsecured Protected Health Information; Interim Final Rule, which becomes effective September 23, the stimulus law also calls for health care providers to:
  • Notify all affected patients within 60 days of a security breach
  • Report security breaches to the HHS secretary and prominent local media outlets if the incident affects more than 500 individuals
  • Track all personal health information disclosures
  • Upon patient request, provide an account of every disclosure for the previous three years
In addition, the new law expands HIPAA regulations to business associates, tightens rules on when patient information can be used for marketing, increases penalties for noncompliance, and enables significantly more aggressive enforcement.

Today’s distributed business environment

The push for more widespread adoption of EHRs comes at a time when the requirements for a secure infrastructure are more challenging than ever, especially given today’s distributed business environment. Increasingly, hospitals’ IT networks are connected to clinics, physician remote offices, remote contractors, suppliers, university networks, and other external parties. At the same time, managed and unmanaged endpoints, including laptops and other mobile devices inside and outside the hospital, are proliferating. As a result, security perimeters must expand beyond the internal network to numerous critical endpoints.

 
In this constantly evolving environment, traditional security measures alone, such as firewalls, antivirus, and intrusion detection systems/intrusion prevention systems, are no longer sufficient.
 
Hospitals must also comply with multiple standards and regulations regarding patient data privacy, including those issued by The Joint Commission, HIPAA, and individual states. As a result, they are implementing methods to monitor and report access to critical systems and information.
 
Security best practices
Symantec recommends that hospitals adopt a comprehensive and automated enterprise security plan, beginning with the creation of a roadmap that includes best practices such as:
  • Performing comprehensive risk assessments
  • Identifying critical endpoints based on criticality of uptime, importance to business processes, and susceptibility to a security or privacy incident
  • Defining cost-effective measures to secure critical endpoints, including mobile devices and databases, and minimize data leakage
  • Implementing automation for ongoing measurement of existing security effectiveness, adherence to security policies, and regulatory compliance
  • Implementing automation for monitoring, quickly identifying and responding to policy violations, and reporting on security and privacy on multiple levels—from executive dashboards to detailed reports for IT staff
  • Protecting sensitive patient information from breaches by implementing data loss prevention

Managing storage complexity

As can be imagined, the adoption of EHRs also has profound implications for hospitals’ storage systems. EHRs summarize and organize patient information, including digitized images of scanned paper documents and electronic data from patients, payers, and pharmacies. They can contain vast amounts of form-based information that must be copied into backup and disaster recovery versions. 

 
For many hospitals, storage demands are already growing more than 70% each year, and current data storage systems aren’t scalable to meet the demands of exponentially increasing amounts of retained data.
 
The rapidly growing storage in hospitals translates to more IT staff resources required to manage it, and the demand is especially burdensome due to the use of disparate storage systems that are based on different technology platforms and have to be managed individually.
 
Without an enterprise-wide storage strategy, providers are continuing to purchase and deploy additional storage islands—each of which requires even more individual management. Implementation of a solution that automates and centralizes the management of stored data using a single interface would maximize the utilization of these various storage systems to accommodate growing amounts of data, thereby reducing costs for purchasing additional storage hardware and relieving demands on IT.
 
Conclusion
Dr. David Blumenthal, the Obama Administration’s National Coordinator for Health Information Technology, said recently that electronic technology will soon be considered “as fundamental to medicine as the stethoscope.” Federal incentives for the meaningful use of such technology, he added, will propel the nation.
 
As hospitals and their IT departments increasingly apply automation to improve patient care quality, attract and retain talent, and reduce costs, traditional IT infrastructures are being pushed to the limit. Factor in the budget cuts brought about by the economic recession, and it’s clear that many hospitals find themselves having to do more with less.
 
Symantec can help healthcare providers attain their EHR goals by delivering best practices and industry-leading products and services for security, storage management, and compliance. To learn how the Symantec Healthcare Provider Solution addresses these critical IT issues, go to the Symantec Healthcare website.
 

Source: Symantec.com

Citrix Offers Trade-up to XenDesktop 4 Program

October 12th, 2009

A unique opportunity for customers to add desktop virtualization to their proven XenApp implementation and save up to 80%.

The Trade-up to XenDesktop 4 Program gives XenApp customers a simple and cost-effective path to the most complete desktop virtualization solution — XenDesktop 4. With XenDesktop 4, customers can do everything they do today with XenApp and deliver high-definition virtual desktops to every user across the enterprise.

Customers who trade-up all their licenses at once can receive two XenDesktop 4 user licenses for every one XenApp license they trade-up. This 2-for-1 offer gives customers an 80% savings compared to buying new desktop virtualization licenses. The Program enables customers to leverage their existing investments to add desktop virtualization now, for an unbeatable price. 

Program options:

  • Trade-up 100% of your active XenApp licenses:  
    • Get 2 XenDesktop 4 licenses for each XenApp license
    • Save up to 80%
  • Trade-up a sub-set of your active XenApp licenses:
    • Get 1 XenDesktop 4 license for each XenApp license
    • Save up to 70%
  • Trade-up any XenApp license without Subscription Advantage:
    • Get 1 XenDesktop 4 license for each XenApp license
    • Save up to 50%

Leverage existing investments

  • Trade-up XenApp licenses to get XenDesktop 4 at a fraction of the cost
  • Use XenApp skills and best practices to simplify desktop virtualization adoption
  • Leverage existing hypervisor, storage and Microsoft infrastructures  

Add desktop virtualization

  • Continue to use XenApp functionality as you always have
  • Add the most comprehensive set of virtual desktop technologies
  • Deliver virtual applications and desktops as on-demand services to any user

 Save on XenDesktop 4

  • Trade-up all active XenApp licenses at once, get 2X the XenDesktop 4 licenses
  • Trade-up your XenApp licenses, save between 50%-80% on XenDesktop 4
  • Trade-up includes 12 months of Subscription Advantage on XenDesktop 4

 Act now. This program is only valid until June 30, 2010! 

Citrix XenDesktop 4 – The virtual desktop revolution is here… for everyone

XenDesktop 4 includes the key features fundamental to extending the benefits of virtual desktops to every user in your organization.

Any device, anytime, anywhere

Today’s digital workforce demands the flexibility to work from anywhere at any time using any device they’d like. Leveraging Citrix Receiver as a universal client, XenDesktop 4 users can access their virtual desktops and corporate applications from any PC, Mac, thin client or smartphone.

HDX™ user experience

XenDesktop 4 delivers an HDX™ user experience on any device, over any network, with better reliability and higher availability than a traditional PC. With Citrix HDX™ technology, users get an experience that rivals a local PC, even when using multimedia, real-time collaboration, USB peripherals, and 3D graphics. XenDesktop 4 offers the best Flash multimedia performance while using 90% less bandwidth compared to alternative solutions.

Citrix FlexCast™ delivery technology

Different types of workers across the enterprise need different types of desktops. XenDesktop 4 can meet all these requirements in a single solution with our unique Citrix FlexCast™ delivery technology. With FlexCast™, IT can deliver every type of virtual desktop – each specifically tailored to meet the performance, security and flexibility requirements of each individual user.

On-demand apps by XenApp™

To reduce desktop management costs, XenDesktop 4 offers the full range of Citrix application virtualization technologies with on-demand apps by XenApp™. With application virtualization, IT can control data access, manage fewer desktop images, eliminate system conflicts, and reduce application regression testing.

Open architecture

XenDesktop 4 works with your existing hypervisor, storage and Microsoft infrastructures, enabling you to leverage your current investments – while providing the flexibility to add or change to alternatives in the future. Whether you use XenServer, Microsoft Hyper-V or VMware ESX or vSphere, XenDesktop supports them all and simplifies management of networked storage using StorageLink™ technology.
