Windows 8 Wireless Connections in the Enterprise

October 10th, 2012

I’ve been having issues attaching to WiFi networks with Windows 8 lately.  Not residential Access Points, but commercial controllers.  I did some searching, and found this:
 
Windows 8 clients may not be able to connect to wireless network
 
Ah…  So Windows 8 natively supports 802.11w, but cannot connect to one of the largest enterprise network footprints in production today.  Hmm.  Apparently, all Cisco controllers need a firmware update before anyone can connect new Windows 8 computers to the wireless network… 
 
Or… 
 
…You can back-rev your driver.  Well, have you dealt with enterprise network people?  Which do you think is more likely to happen? 
😉
 
So, here you go; a little instructional video to help you downgrade your shiny-new Windows 8 wireless network driver to the Windows 7 version, so that you can play in the corporate sandbox.  In this slideshow, Paul demonstrates the problem (utilizing Windows 8’s great new automatic screen capture feature)…
 
(Note: To easily get to the menu that contains Device Manager – the “Windows desktop and Administrative tools” menu – use “Winkey + X” from the Windows classic desktop.)
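
Once the rollback is done, it’s handy to confirm which driver version is actually in use.  Here’s a minimal PowerShell sketch (works on both Windows 7 and Windows 8) that lists the installed network drivers via WMI:

# List installed network drivers and their versions to confirm the rollback.
Get-WmiObject Win32_PnPSignedDriver |
    Where-Object { $_.DeviceClass -eq 'NET' } |
    Select-Object DeviceName, DriverVersion, DriverDate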
 

 
We hope it helps!
 
(…with contributions from Paul Opper and Jeremy Pavlov)

 

How to script “Ownership” of NTFS File Systems…

July 25th, 2012

There’s a time in every IT professional’s life when he or she will need to “Take Ownership” of files and folders that reside on an NTFS file server (or, in larger cases, on hundreds or thousands of servers) in a Windows Server 2008 R2 or Windows 7 environment.  I’m sure most IT professionals already know how to do this in the Windows Explorer GUI… but what if the task at hand required you to script this process to run during a limited window of time during a server migration, minimizing both the number of “clicks” and the time spent on multiple servers?

I was recently assigned to a project very similar to the scenario described above; and after a little research, I stumbled upon a little-known Microsoft tool called ‘takeown.exe’ that has been shipping with Microsoft Server products since Windows Server 2003.  Within minutes of discovering ‘takeown.exe’ I had a script written and I was running it in my test environment with positive results.  This shows how simple the tool really is!

Below is the usage syntax as shown by ‘takeown.exe /?’ at the command line:

TAKEOWN [/S system [/U username [/P [password]]]] /F filename [/A] [/R [/D prompt]]

Below is my personally recommended example: 

TAKEOWN.exe /F C:\MyFolder /R /A

 

As expressed above, my suggestions are to use the /F switch (to specify the folder to take ownership of), the /R switch (as in “recursive,” meaning apply to all child objects, sub-folders, and files), and the /A switch (which gives ownership to the “Administrators” group instead of the currently logged-in user).  And while I didn’t use the /D switch in the above example, it may be necessary to use “/D Y” to avoid being prompted in cases where the user ID running the command does not have rights to list the folders.
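
If you need to run this across the many servers mentioned earlier, the command wraps easily in a loop.  Here’s a minimal PowerShell sketch, assuming a hypothetical servers.txt file (one hostname per line) and an account with administrative rights on each server:

# Take ownership of the same folder on every server in the list.
foreach ($server in (Get-Content servers.txt)) {
    Write-Output "Taking ownership on $server..."
    takeown.exe /S $server /F C:\MyFolder /R /A /D Y
}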

You can also reference additional parameters by typing in ‘takeown.exe /?’ from the command prompt on any Windows Server 2008 R2 server or Windows 7 machine.

 

The Modify or Alter Column Statements, and MS SQL 2008…

June 13th, 2012

(…with technical guidance from Avi Moskovitz…)

Recently, I needed to change the width of a column in the SQL Server portion of one of our databases using MS Management Studio 2008.  And while this meant changing the data type, you’d think it would be as easy as simply changing the properties within the table, in the same manner that you can within Access 2010; normally, less than one minute to get ’er done.  And in organizations where it is permissible to un-check the “Prevent saving changes that require table re-creation” option on production systems, this might well be the case (as in the graphic below).

 

The "Prevent saving changes that require table re-creation" option

The "Prevent saving changes that require table re-creation" option

 

But, in my situation, I had to find another route; this was going to require scripting…

1 – A Wild Goose Chase

A few quick searches for things like “How do you change the width of a column in SQL?” put a big grin across my face — I found a reference to a MODIFY statement with an example of a column width change:

ALTER TABLE my_table MODIFY this_column VARCHAR2(50);

My problem was solved!  Or, so I thought.  After trying a few times and getting error messages, I almost started to suspect a conspiracy — or at least that a concerted effort was in place to confuse and befuddle…

Luckily, Coretek is staffed with experts on a wide variety of software and hardware platforms.  I reached out to our resident SQL guru, Avi Moskovitz, who informed me that the “solution” I found referencing the MODIFY statement WAS accurate – if used in an Oracle environment – but MS SQL 2008 does not support it.

2 – Drastic, Dangerous, but Legitimate

One possible option is to create a new column and delete the old one; something you should not be in a hurry to do if you have data in the column, as you need to think about how the data will be repopulated (will it affect links, relationships, etc., or will it be truncated or corrupted?).  I have found that creating a new column in the SQL backend (with the correct parameters), then going to the front end (if you happen to have something like an Access front-end) and copying the information from the old column into the new column, will work; but beware of deleting a key or a linked field.
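
If you do go this route, here is a minimal T-SQL sketch of the add-copy-drop approach (table and column names are hypothetical); be sure to verify keys, links, and possible truncation before dropping the old column:

-- Add the new column with the corrected width.
ALTER TABLE my_table ADD new_column NVARCHAR(50) NULL;
GO
-- Copy the data across (this will fail or truncate if values don't fit).
UPDATE my_table SET new_column = this_column;
GO
-- Only after verifying the copy: remove the old column.
ALTER TABLE my_table DROP COLUMN this_column;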

3 – More Drastic, Less Desirable

Another, more drastic and less desirable option is to re-create the entire table in the SQL backend with the correct parameters.  This is a multi-step process that, depending on the size of the table, can take 5 to 30 minutes (or more, if there are hiccups along the way).  In doing this, you are effectively manually re-creating what SQL does when changing a data type for a field.  Not necessarily recommended, but I am keeping these steps provided by Avi in case I ever need them:

  1. Create a temporary table which will host your data (let’s call it MyTempTable).
  2. Copy the data from the original table (MyFirstTable) to MyTempTable, making sure that you set Identity_Insert “ON” so that it keeps the Key Field intact when the data gets copied in.
  3. Delete MyFirstTable. 
  4. Recreate the original table (a duplicate if you will of MyFirstTable…MySecondTable)
  5. Copy the data from the MyTempTable to the New MySecondTable making sure that you set Identity_Insert “ON” so that it keeps the Key Field intact when the data gets copied in.
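
Here’s a minimal T-SQL sketch of those steps, assuming a hypothetical table with an IDENTITY key column (Id) and one data column (Name) whose width is being corrected:

-- Steps 1 & 2: copy the data out to a temporary table.
SELECT * INTO MyTempTable FROM MyFirstTable;
GO
-- Step 3: delete the original table.
DROP TABLE MyFirstTable;
GO
-- Step 4: re-create the table with the corrected column width.
CREATE TABLE MySecondTable (
    Id   INT IDENTITY(1,1) PRIMARY KEY,
    Name NVARCHAR(50) NOT NULL
);
GO
-- Step 5: copy the data back, keeping the key field intact.
SET IDENTITY_INSERT MySecondTable ON;
INSERT INTO MySecondTable (Id, Name)
    SELECT Id, Name FROM MyTempTable;
SET IDENTITY_INSERT MySecondTable OFF;
GO
DROP TABLE MyTempTable;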

4 – There’s a Right Way

But perhaps the best way of all is to use a variant of the first thing mentioned above, correctly formatted for MS SQL 2008 with ALTER COLUMN instead of MODIFY, as follows:

 ALTER TABLE my_table ALTER COLUMN this_column NVARCHAR(50);

Execute this command in the query window, and in an instant the problem is solved; the data type is changed (as in the graphic below)!

SQL ALTER COLUMN Command – Successful!

Fortunately, the command will fail if it detects data that does not fit the new type.  So what’s the lesson?  Internet searches are not a fool-proof way to explore scripting options; without the experience to understand the ramifications, the options can sometimes do as much harm as good…

 

Finding Rogue KMS Servers in the Enterprise…

February 8th, 2012

In larger Enterprises with Microsoft-based infrastructure, it’s highly likely that the licensing for the Windows 7 workstations will be based on the Microsoft KMS model.  If you don’t already know, this means you run servers in-house that register themselves into DNS as license providers, and Windows clients will learn of them (and become affiliated with them) to get a license, rather than contacting Microsoft themselves across the Internet.

Unfortunately, one problem that can occur is that someone who has access to the Microsoft license codes (like an I.T. worker, developer, etc.) might accidentally install a KMS license on a server that is not intended to be a KMS server.  And when a KMS license is installed, the server doesn’t know any better, and dutifully registers its KMS capability with the internal Active Directory-based DNS as a _VLMCS SRV record.

Recently, I ran into a situation where I needed to hunt down and eliminate some accidentally rogue KMS servers that had cropped up across a large infrastructure, and be able to re-check at regular intervals.  While I originally wrote the script as a bash shell script for Linux, I re-wrote it into PowerShell recently for someone who asked, and I thought I’d post the new version here.

Mind you, this is a stripped-down version of the script, but it includes all that is needed to run the check manually for a hierarchical DNS infrastructure (although you may wish to strip out components if you just want to check the parent domain). 
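
Incidentally, you can run a quick one-off check by hand from any domain-joined machine; it’s the same query the script automates (substitute your own domain name):

nslookup -type=srv _vlmcs._tcp.mydomain.local

Any host that answers with a _vlmcs SRV record is advertising itself as a KMS server.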

Copy the contents below, paste them into a PowerShell script file (*.ps1), change the variables at the top… and have fun!

 

# Change the following 3 variables as needed.
# This script will loop through the subdomains, checking for KMS servers in each
# subdomain, and then at the parent domain.
$subs = @("subdomain1", "subdomain2", "etcetera")
$parentdomain = "mydomain.local"
$outfile = "checkKMS-Results.txt"
write "KMS check report..." | Out-File $outfile
write " " | Out-File $outfile -append
write "The only valid KMS servers are at the $parentdomain, as follows:" | Out-File $outfile -append
write "KMS1, KMS2, KMS3" | Out-File $outfile -append
write " " | Out-File $outfile -append
write "There should not be a KMS server at any of these locations:" | Out-File $outfile -append
foreach ($item in $subs)
{
  write "Checking subdomain: $item"
  $result = nslookup -type=srv "_vlmcs._tcp.$item.$parentdomain." | findstr /C:"_vlmcs" /C:"svr hostname"
  if ("X$result" -eq "X")
  {
    write "No registered KMS server in $item" | Out-File $outfile -append
  }
  else
  {
    write "***KMS FOUND at this location: ***" | Out-File $outfile -append
    write $result | Out-File $outfile -append
  }
}
write " "  | Out-File $outfile -append
write "On the contrary, the following should be valid KMS servers:" | Out-File $outfile -append
$result = nslookup -type=srv "_vlmcs._tcp.$parentdomain." | findstr /C:"_vlmcs" /C:"svr hostname"
$result | Out-File $outfile -append
write "...Done!" | Out-File $outfile -append

Enjoy!

🙂

 

SharePoint 2010 – How to move a subsite to a different location

October 19th, 2011

I was recently tasked with reorganizing a Microsoft SharePoint 2010 site.  One of the things that I needed to do was to move a couple of subsites to be under a different parent site within the same site collection.  I researched and was able to find some options, but they all entailed exporting and then importing — or backing up and then restoring — the site via the command line.

So, I decided to try the export/import method via the command line right on the SharePoint web front end server.  To export the site, these are the steps I tried:

  1. Fire up the command line
  2. CD "Program Files\Common Files\Microsoft Shared\web server extensions\14\BIN"
  3. Export the old URL using the following Command:

stsadm -o export -url http://intranet/website -filename c:\testbackup.cmp

However, when I tried this under my domain and SharePoint administrator account, I got this error:

The Web application at http://intranet/website could not be found. Verify that you have typed the URL
correctly.  If the URL should be serving existing content, the system administrator may need to add a
new request URL mapping to the intended application.

After searching a bit on this, it seemed like the problem may have been a permissions issue.  So, I confirmed that the administrator account I used had SharePoint Farm Administrative rights, as well as site collection administrative rights.  I then tried to run this same command with a designated “SharePoint Administrator” account which also had SharePoint Farm Administrative rights and Site Collection Administrative rights, but then got an Access Denied error.
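
As an aside, SharePoint 2010 also provides PowerShell equivalents of the stsadm export/import, though they are subject to the same permission requirements.  A minimal sketch, using the same hypothetical URLs as above plus a made-up destination parent site:

# Run from the SharePoint 2010 Management Shell (or load the snap-in first).
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Export the subsite to a package file.
Export-SPWeb -Identity http://intranet/website -Path C:\testbackup.cmp

# Import it under a different parent site in the same site collection.
Import-SPWeb -Identity http://intranet/newparent/website -Path C:\testbackup.cmp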

After spending some time adding and removing rights for both of those accounts to see if I could get this working, I tabled the issue and moved on to other tasks.  One of those tasks was to delete some old subsites from our SharePoint.  I ended up stumbling upon the Content and Structure link under Site Administration:

And to my surprise, I found that it is possible to select any site and copy or move it:

  1. Select the parent site of the subsite you want to move in the left navigation pane
  2. Check the box next to the subsite that you want to move in the right pane, click the Actions drop-down and click Move
  3. Select the destination for the subsite in the next dialog box

This method ended up being a lot more straightforward and quicker than the command-line option.

I’m glad I found it!

What’s New with Microsoft Exchange 2010?

September 1st, 2010

Microsoft Exchange 2010 helps you achieve new levels of reliability and performance by delivering features that simplify your administration, protect your communications, and delight your users by meeting their demands for greater business mobility.  With new deployment and storage options, enhanced inbox management capabilities and e-mail archiving built-in, Exchange 2010 helps you lower costs and enhance business outcomes.

Flexible and Reliable

With Exchange, choose from on-premises deployment with Exchange Server 2010, a Microsoft hosted service with Exchange Online, or a seamless mix of both.  Microsoft’s commitment to Software plus Services ensures you can decide on your timeline for taking advantage of the flexibility and power of both without interrupting or changing your users’ experience.

Exchange 2010 offers a simplified approach to high availability and disaster recovery, coupled with enhanced maintenance tools, to help you achieve new levels of reliability and deliver business continuity.  Building on the Continuous Replication technologies introduced in Exchange 2007, these investments:
  • Remove the need to deploy complex and costly clustering and third-party data replication products for full-scale Exchange redundancy
  • Automate mailbox database replication and failover with as few as two servers or across geographically dispersed datacenters
  • Maintain availability and fast recovery with up to 16 Exchange-managed replicas of each mailbox database
  • Limit user disruption during mailbox moves between e-mail servers, allowing you to perform migration and maintenance activities on your schedule, even during business hours
  • Guard against lost e-mail due to Transport Server upgrades or failures, through new built-in redundancy capabilities designed to intelligently redirect mail flow through another available route

Lowering the burden on your help desk and yourself is a key way in which you can accomplish more and reduce costs. This motivated investments in new self-service capabilities aimed at enabling users to perform common tasks without having to call the help desk. With this functionality you can:

  • Allow users to update their contact information and track delivery receipt information for e-mail messages, for example, without IT assistance
  • Offer an easy-to-use Web-based interface for common help desk tasks
  • Utilize the new Exchange Roles-based Access Control model to empower specialist users to perform specific tasks – like giving compliance officers the ability to conduct multi-mailbox searches – without requiring administrative control

 

Anywhere Access

Enhancements in the latest release of Exchange provide your users access to all of their communications from a single location while making it easier for them to collaborate with each other and their business partners.  These enhancements include the ability to:

  • Offer your users a premium Outlook experience across the desktop, Web, and mobile devices, including OWA support for browsers like Apple Safari and Mozilla Firefox
  • Unify access to e-mail, voice mail, instant messaging, and text messages enabling your users to choose the best way to communicate no matter where they are
  • Add native support for virtually every mobile device, including a premium experience with Windows Mobile, through Exchange ActiveSync
  • Share free/busy information with external business partners for fast and efficient scheduling, choosing the level of detail you wish to share

Exchange 2010 adds new productivity features that help your users easily organize and prioritize the communications in their inboxes. Your users will experience:

  • An enhanced conversation view that streamlines inbox navigation by automatically organizing message threads based on the natural conversation flow between communicating parties
  • MailTips that inform your users, before they click send, about message details that could lead to undeliverable or mis-sent e-mails, like accidentally sending confidential information to external recipients, reducing inbox clutter, extra steps, and help desk calls

With Exchange 2010, you can replace your traditional voice mail system with a unified solution integrated into the core of your communications platform. This new system will enable your users to receive their voice mail messages right in their inboxes, and manage those voice mail messages just as they do e-mail, with familiar tools like Outlook and Outlook Web Access. You will benefit from the cost-savings of voice mail systems consolidation and replacement and provide your users features like:

  • Text transcription of voice mail messages, allowing users to quickly triage messages without having to play the audio file
  • The power of a personalized auto attendant for their voice mail
  • Tools to create call answering and routing rules for individuals or groups of callers based on Caller ID and contact information ensuring that every caller gets the experience your users intend
  • Phone-based access to their whole inbox – including e-mail, calendar, and contacts – in nearly 30 languages with Outlook Voice Access

 

Protection and Compliance

Exchange 2010 delivers new, integrated e-mail archiving functionality–including granular multi-mailbox search, item-level retention policies and instant legal hold–making it easier to address compliance and discovery issues. Administrators get centralized control of all archives while users get direct access to their archived mail, including a familiar archive experience that does not disrupt the way they manage their inboxes every day. With these new features you can:

  • Easily move unwieldy Outlook Data Files (PSTs) from the PC back to Exchange for more efficient control and legal discovery
  • Simplify the classification of e-mail with new centrally definable Retention Policies that can be applied to individual e-mail messages or folders
  • Conduct cross-mailbox searches through an easy-to-use Web-based interface, or through Roles-based access control, empowering your HR or compliance officers to execute targeted searches

Exchange 2010 also expands Information Protection and Control support, making it easier to encrypt, moderate and block sensitive or inappropriate e-mail based on specific sender, receiver and content attributes. Key functionality enables you to:

  • Combine Exchange 2010 and Active Directory Rights Management Services (ADRMS) so that you and your users can apply Information Rights Management protection automatically to restrict access and use of information within a message–wherever it is sent.
  • Enable partners and customers to read and reply to IRM-protected mail–even if they do not have Active Directory Rights Management Services (ADRMS) on premise
  • Enable managers to review mail and either approve or block transmission

For more information, please visit Microsoft’s website at http://www.microsoft.com/exchange/2010/en/us/whats-new.aspx

Symantec Enterprise Vault Archiving Framework Overview

August 23rd, 2010

Over 5,000 global customers use Enterprise Vault as an archiving framework to reduce risk and increase operational efficiency

Business is drowning in data. As part of normal daily operations, large numbers of emails and documents are generated; buyers may negotiate purchasing contracts over email, financial analysts may share data via Microsoft SharePoint Portal Server, and product managers may publish pricing and competitive information via file shares. All of these activities are fundamental to the smooth operation of the business, and all generate business records which, at some point, may be required for regulatory, litigation, or knowledge purposes.

Today’s complex storage and server infrastructures are not designed to cope with the efficient long-term retention of large volumes of data. The backup/recovery process becomes impossible within a reasonable timeframe. Keeping up with and exploiting the latest software capabilities raises the huge challenge of migrating legacy data. With IT budgets and resources stretched to the limit, the temptation to purge historical data is strong. But the increasing legal and regulatory demands mean this is not an option.

 

Flexibility and adaptability
Delivering value today, meeting your future needs

Without a flexible approach to storage the operational risks are overwhelming.  Without an adaptable platform for information exploitation the risk of non-compliance could be catastrophic. Imposing complex document retention solutions on your users will result in overload. Enterprise Vault is designed to deliver long-term, archival storage of multiple types of data. The information and data lifecycles are managed and relevant content can be found on demand.  A complete solution is delivered without disruption to the user, avoiding the need to change business processes and retrain staff.  The heart of this solution is the Enterprise Vault archiving framework.  Designed to offer secure, scalable information retention that is nonintrusive for the user, the framework is a natural evolution of Symantec’s market-leading archiving solution deployed around the world.

The archiving framework

Every organization is different. You need a solution that meets your business and operational needs, taking into account your current and potential future requirements. The archiving framework delivers the exact solution you require today, giving you the platform to meet current and future information retention requirements and exploit advances in storage and server technology. The framework consists of services grouped into a number of modules. By providing such a modular solution Symantec maximizes the value of your investment.

Components of archiving

The solution of the archiving problem can be broken down into a number of areas; each of these is represented by a layer in the archiving framework.

Enterprise Vault Repository

Central to every organization is the creation and ongoing management of the archive and its contents. This capability is delivered by the Enterprise Vault Repository. This is designed to deliver the capability to securely retain archived data in such a way that it can be fully exploited, and disposed of when it is no longer required. The Repository is designed to:

• Store archived content on the most appropriate platform
• Compress and single-instance for storage reduction
• Index content for rapid and targeted retrieval
• Render an HTML copy of all archived content, thereby securing future accessibility
• Utilize user authentication security controls
• Define and implement retention and deletion policies

Content sources

Enterprise Vault is designed to store content from multiple data sources. Movement of content from the application to the Repository is carried out transparently via policy control. This ensures that fundamental records can be retained without affecting normal daily operations.

Open storage layer

The Repository provides for retention of the archive on the initial storage platform. Depending on the size of the organization and the nature of the content, there may be a need to use other types of storage, or indeed consolidate and partition archives for maximum efficiency. Your storage infrastructure may contain SAN and NAS technology and increasingly may embrace newer technologies such as EMC Centera and Network Appliance NearStore. The timescale for retention of content may be a few years, or a few decades. Today’s storage media may be regarded as legacy in the future. Storage lifecycle management therefore is paramount to the long-term usefulness of archived content. To facilitate true lifecycle management the Open Storage Layer allows policy-based migration of archived data across storage media through time. This allows you to take advantage of the storage media best suited to the content’s age and usage profile, and reduce the total cost of storing the archive.

Performance and scalability infrastructure

The Enterprise Vault Repository is architected to provide the optimal performance for single or distributed server solutions. As the deployment environment becomes more complex, so it becomes important to have the ability to deploy components of Enterprise Vault in a distributed manner. Symantec and its partners have tested Enterprise Vault to beyond 100,000 users. A building block approach allows different servers to be used for archiving, indexing, journaling, and many other activities.

Universal access layer

Due to the wide range of data sources that can be archived, and the many ways in which end users work, it is important to provide an adaptable user interface. One of the key premises of Enterprise Vault is that end users should not have to change the way they work as a result of the implementation of archiving. Email clients should still be able to access current and historical email; file system content should still be accessible, and when working offline, archived email should still be available. To this end, Enterprise Vault has available a wide variety of client options:

• Transparent shortcuts in Microsoft Outlook®
• Universal shortcuts for non-Outlook clients
• Full access from Microsoft Outlook Web Access
• Transparent placeholders for archived file system content
• Hierarchical “Explorer” view across all archived sources
• Offline access for mobile and remote users
• Simple and extended full text search capabilities from Outlook and Web interfaces

The implementation of archiving and keeping of business records need not be intrusive for end users. They use familiar tools and continue to work as before.

Archive exploitation

Reducing the overall cost of storage should be only part of the TCO addressed by an archiving solution. Content is being kept because at some point in the future you will need to access it; otherwise, why would you keep it? Typically business records need to be accessed as part of a regulatory action, or as part of a litigation action. In both of these circumstances it is paramount that the information can be found very quickly and that only relevant information is disclosed. The ability to search the archive is only one part of the requirement. Locating the right set of content in response to litigation (perhaps as part of a contractual dispute) will require multiple searches. Search sets will require review by people with different levels of experience. An audit will be required of how decisions were made on what material to disclose. There is a process to this and it requires workflow. Enterprise Vault business accelerators deliver this workflow and drive the discovery or compliance process. Initially there are two business accelerators available within the framework:

• Discovery of content in response to litigation or a similar event (Discovery Accelerator)
• Regulatory supervision in line with finance regulations (Compliance Accelerator)

Framework benefits


• Built upon proven solutions from the market leader
• Supports many different content sources, not just email
• Nonintrusive for the end user
• Flexible deployment options
• Complete exploitation of archived content
• Addresses operational efficiency and compliance
• Enables information and storage lifecycle management
• Adaptable for future needs
• Reduces cost, reduces risk

Both Accelerators use the inherent search capability of Enterprise Vault to gather records that may be relevant to a case. The records are then assigned to a reviewer who is responsible for deciding whether the records are relevant to the case in hand. If in doubt, the decision can be cascaded to more experienced reviewers. The end result is a set of business records of absolute relevance, which may then be packaged for disclosure. Throughout the process an audit trail is retained.

Archiving for compliance, records management and operational efficiency

Reducing risk and increasing efficiency

Enterprise Vault provides a framework of solutions that will increase your operational efficiency and decrease your business risk. Compliance is not just about email retention. Reducing cost is not just about storage optimization. Symantec helps you move beyond simply archiving email. We work with you to deliver a solution that allows email and other critical content sources to be archived and exploited on demand. This approach brings operational benefit to the data sources and preserves records for future use. Today your organization is looking for an application to underpin your compliance initiatives and reduce the future TCO of your storage and information infrastructure. Enterprise Vault is the proven solution.

Coretek Services is a Michigan-based systems integration and IT consulting company that works with virtualization infrastructure, and is also a Symantec Gold Partner specializing in Symantec Enterprise Vault (SEV), Symantec Backup Exec (SBE), and Symantec Endpoint Protection (SEP).  Please contact us today for any virtualization or Symantec product requirements.

 

Source: Symantec.com

Coming This Fall: Windows Azure Cloud Appliances

July 20th, 2010

Addressing one of the key objectives of cloud computing, Microsoft today said its Windows Azure platform will be available as an appliance that can run on customer and partner premises.

The company revealed plans to offer the Windows Azure Appliance at its Worldwide Partner Conference, which began today in Washington, D.C. The appliance, which Microsoft has talked up conceptually for several months, will be offered later this year by key partners — initially Dell, Fujitsu and Hewlett-Packard Co. The appliance will enable private clouds based on huge turnkey systems equipped with the Windows Azure platform, server, storage and network infrastructure. eBay said it too will use the appliance.

“The Windows Azure appliance fundamentally takes the Windows Azure service and extends it,” said Bob Muglia, president of Microsoft’s Server and Tools business, speaking in the opening keynote of WPC. “It extends it to our service providers, allowing you to have exactly the same capabilities within your data center, providing that capability to your customers, and it can be extended to our larger customers that want to provide IT services within their own organizations.”

Details of the new appliance were vague, including cost, configuration and how it will be rolled out to customers. Muglia did say the new appliance is based on Windows Azure and SQL Azure with hardware specified by Microsoft, allowing service providers to either offer their own hosted Azure-based services or provision the appliances, initially to large data center customers, on-premises. The availability of such private cloud implementations addresses issues of control and compliance that have made cloud computing unfeasible for many corporate and government customers.

“The benefits are associated with control, compliance and keeping the data locally, data sovereignty. These are important benefits that allow for much more extensive solutions being built around this cloud environment,” Muglia said.

For eBay, the appliance will ease deployment without moving its huge auction and PayPal payment processing service off premises. “If I want to deploy an application today for eBay.com within my data centers I need to secure the hardware, provision a network, hook up the load balancer and make it part of the infrastructure,” said James Barrese, eBay’s VP of Technology, speaking at a press conference following the keynote.

Dell, Fujitsu and HP will all offer the appliances later this year, based on pre-defined hardware specifications by Microsoft. The hardware vendors said they see opportunities for both offering hosting services to customers as well as selling systems to very large enterprises such as government agencies and large corporations.

Though the companies are not discussing the configurations, the initial implementations will house just shy of 1,000 servers, Muglia said. One partner that appeared totally surprised by the launch of the appliance was Harry Zarek, CEO of Compugen in Toronto. When confronted on camera by Jon Roskill, the new Corporate VP for Microsoft’s Worldwide Partner Group, Zarek said, “We have been a Microsoft partner for 20 years, having gone through the traditional product resale and service support. We had a fear that this business was going to trickle through our hands and move into the data center. We had a big question what we would be left with. This is the missing link, this is the piece we need to give us the destination over the next few years, in the cloud, and we have an important role to play.”

Muglia said the cloud has forced Microsoft to reinvent itself and will require its partners to do the same. “It’s a change that is inevitable, it is a change that allows us all to deliver new value, it’s a change that thankfully is not happening overnight, and it is a change we will embrace together,” he said.

Source: Redmondmag.com, By: Jeffrey Schwartz

Virtual Servers, Real Growth

July 12th, 2010

 

If you follow tech industry trends, you’ve probably heard of cloud computing, an increasingly popular approach of delivering technology resources over the Internet rather than from on-site computer systems.

Chances are, you’re less familiar with virtualization — the obscure software that makes it all possible.

The concept is simple: rather than having computers run a single business application — and sit idle most of the time — virtualization software divides a system into several “virtual” machines, all running software in parallel.

The technology not only squeezes more work out of each computer, but makes large systems much more flexible, letting data-center techies easily deploy computing horsepower where it’s needed at a moment’s notice.

The approach cuts costs, reducing the amount of hardware, space and energy needed to power up large data centers. Maintaining these flexible systems is easier, too, because managing software and hardware centrally requires less tech support.

The benefits of virtualization have made cloud computing an economical alternative to traditional data centers.

“Without virtualization, there is no cloud,” said Charles King, principal analyst of Pund-IT.

That’s transforming the technology industry and boosting the fortunes of virtualization pioneers such as VMware (NYSE:VMW) and Citrix Systems (NMS:CTXS), two of the best-performing stocks in IBD’s specialty enterprise software group. As of Friday, the group ranked No. 24 among IBD’s 197 Industry Groups, up from No. 121 three months ago.

1. Business

Specialty enterprise software represents a small but fast-growing segment of the overall software enterprise market, which according to market research firm Gartner is set to hit $229 billion this year.

As with most software, the segment is a high-margin business. With high upfront development costs but negligible manufacturing and distribution expenses, specialty software companies strive for mass-market appeal. Once developers recoup their initial development costs, additional sales represent pure profit.

Software developers also make money helping customers install and run their software, another high-margin business.

But competition is fierce. Unlike capital-intensive businesses, software companies require no factory, heavy equipment, storefront or inventory to launch. Low barriers to entry mean a constant stream of new competitors looking to out-innovate incumbents.

In addition to the virtualization firms, notable names in the group include CA Technologies (NMS:CA) and Compuware (NMS:CPWR).

All offer infrastructure software to manage data centers.

“Big-iron” mainframe computers began using virtualization in the 1970s, around the time when CA and Compuware were founded.

In the late 1990s, VMware brought the technology to low-cost systems running ordinary Intel (NMS:INTC) chips. VMware has since emerged as the dominant player in virtualization.

Citrix has added a twist to the concept, virtualizing desktop computers. Rather than installing workers’ operating system and applications on hundreds of PCs spread across the globe, companies can use the technology to run PCs from a bank of central servers. Workers, who access their virtual PCs over the Internet, don’t know the difference.

Microsoft (NMS:MSFT) has jumped in with its own virtualization product, Hyper-V, which it bundles free into Windows Server software packages. Oracle (NMS:ORCL) and Red Hat (NYSE:RHT) have launched virtualization products as well.

Meanwhile, CA and Compuware are racing to move beyond their mainframe roots to support virtualization and cloud-computing-enabled data centers. In February, CA said it would buy 3Tera to build services and deploy applications aimed at the cloud-computing market.

And Compuware bought privately held Gomez, Inc. last fall to manage cloud application performance.

Name Of The Game: Innovate. With a fast-moving market and steady influx of new competitors, keeping customers happy with good service and money-saving breakthroughs is vital.

2. Market

Nearly everyone who runs a corporate computer system is a potential buyer of virtualization software. Companies ramping up their information-technology purchases use the software to manage their sprawling infrastructure; others with limited budgets use it to squeeze more out of their existing systems.

Sales of server-virtualization software are set to grow 14% this year to $1.28 billion, according to a report by Lazard Capital Markets. Sales of software to manage virtual environments will grow 44% in 2010 to $1.88 billion.

Desktop virtualization revenue will rise 184% this year to $847.8 million. Citrix has the edge in this budding market with its XenDesktop product.

VMware is dominant among large enterprises, controlling about 85% of the server virtualization market. Microsoft is favored by small and midsize companies.

Virtualization is seen as “a strategic asset” for enabling cloud computing, and continues to gain momentum, says Lazard analyst Joel Fishbein.

VMware has the early-mover advantage in this market with its vSphere platform and has stayed ahead by adding new features such as data security and disaster recovery, analysts say.

But Citrix is partnering closely with Microsoft to take on VMware in virtualization.

3. Climate

Competition is heating up as companies scramble to adopt virtualization. Before 2009, just 30% of companies used virtualization, says analyst Fishbein. This year, that will double to 60%. Most of the gain is coming from small and midsize customers.

In addition, virtual servers are soon expected to more than double as a percentage of the overall server workload, from 18% today to 48% by 2012.

VMware says it can stay a step ahead of the pack by building new features into its products, says Dan Chu, VMware’s vice president of cloud infrastructure and services.

“We have a large technology lead with what we enable for our customers,” Chu said. “We are several years ahead of what the others are doing.”

Citrix CEO Mark Templeton says his firm’s broadening strategy — offering a variety of products with multiple licensing options and distribution channels — will grow sales.

“What’s going on is a massive shift in how computing gets delivered,” Templeton said. “In an environment that’s changing so dramatically, the highest-risk thing you can do is not act.”

4. Technology

The first virtualization boom stemmed from a shift over the last decade away from big expensive mainframes and minicomputers to massive banks of cheap Intel-powered machines. Virtualization gave these low-cost systems some of the high-end features of their pricier counterparts.

Virtualization software makers are betting on a second wave of growth fueled by the industrywide shift to cloud computing.

Technology managers use virtualization to run cloud computing in their own data centers. And large tech vendors such as Microsoft use the technology for cloud-computing services they sell to customers.

Dividing computers into isolated virtual machines gives cloud service providers the benefits of shared computing resources without the security downsides.

VMware has the early lead in virtualization. But the technology is quickly becoming a commodity as Microsoft and others bundle it into their broader platforms.

“VMware is known as a virtualization company, and Microsoft is a platform company,” said David Greschler, who heads up Microsoft’s virtualization efforts. “Their strategy is to sell virtualization, but our strategy is to make virtualization available as part of a larger platform at no extra cost.”

At the same time, a shift toward a world of cloud-computing services hosted by the likes of Microsoft, Amazon.com (NMS:AMZN) and Google (NMS:GOOG) could lead to fewer companies purchasing virtualization software themselves.

Source: Investor’s Business Daily

Microsoft Starts Windows Embedded Update Service

July 2nd, 2010

Microsoft has launched a free Windows Embedded update service for device developers; it went live on Monday.

The new Windows Embedded Developer Update (WEDU) service is currently available and can be accessed by downloading the software here. The software can be installed and run on Windows Vista Service Pack 2, Windows 7, Windows Server 2008 and Windows Server 2008 R2.

The WEDU service, which reduces the time developers have to spend searching for updates, currently provides updates only for Windows Embedded Standard 7 developers. Microsoft plans to add support for Windows Embedded Compact 7 “within the calendar year,” according to the company’s announcement. Windows Embedded is Microsoft’s family of componentized operating systems used to support thin clients and various devices.

Project managers can use WEDU (pronounced “we do”) to ensure that their teams have the most current development environments. Users of WEDU need to have administrative access privileges to manage the service.

To use WEDU, administrators specify the products that should receive updates by registering them through the service. The next step is to specify the locations of the distribution shares where the updates should be activated, according to an MSDN library article. WEDU will search for daily updates in the background. Administrators can also perform manual scans for new updates.

The service comes with a few caveats. While updates can be automated, the WEDU tool doesn’t let the user remove the updates. Windows Control Panel has to be used in those instances to remove “certain updates for developer tools,” according to the MSDN article. The article adds that “updates to distribution shares and repository databases cannot be removed.”

Microsoft provides advice on maintaining distribution shares and creating distribution shares in its blogs. The former blog recommends importing all Microsoft-released packages and updates and not removing packages from distribution shares. Distribution shares should be backed up before importing any updates.
