Question and Answer with Doug Ehrenreich: Managing Software Licenses for Virtual CRM Datacenters -- It's All about Saving Money!

January 10, 2013


We conducted an interview with Doug Ehrenreich, iQuate’s VP of Business Development, to gain deeper insight into the IT challenges around software asset management in datacenters. iQuate, a global IT inventory and discovery company, provides the technology that enables complete, continual and accurate visibility of highly complex physical, virtual and cloud environments. Datacenters today are grappling with how to achieve a verifiably accurate asset inventory to improve the critical business decisions that control costs and mitigate risks. In furthering our discussion on this topic, Doug boiled it down for us by outlining the top challenges for software asset management (SAM) in datacenters today.


What do you perceive as the top three challenges for SAM in CRM datacenters using a cloud environment?


1st Challenge: Multi-core Servers

The innovations in multi-core microprocessors have shaped how enterprise software is used in the datacenter. Customers have taken advantage of the cost/performance curve to align compute capacity with user demands. Enterprise software companies recognized this and changed the way they license products installed on multi-core processors in order to capture additional license value. In most cases, IT was configuring and deploying applications on these servers without understanding the licensing implications. Unless customers received an audit notice, these organizations might not even have been aware of the licensing issue, or of the potential financial liability, until it was too late. Traditional enterprise management tools installed in the datacenter were not designed to discover and inventory the in-depth software configuration details needed to reconcile what the organization has deployed. Therefore, the first challenge is gaining accurate visibility into what has been installed on these servers and how those software assets are actually deployed.
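To make the arithmetic concrete, here is a minimal sketch of how per-core licensing can multiply costs on modern multi-core servers. The core factors and core counts are illustrative assumptions, not any vendor's actual licensing terms:

```python
# Illustrative only: core factors and core counts are hypothetical,
# not any vendor's actual licensing terms.
CORE_FACTORS = {"x86": 0.5, "risc": 1.0}

def licenses_required(physical_cores, cpu_family):
    """Per-core licensing: physical cores multiplied by a core factor."""
    return physical_cores * CORE_FACTORS[cpu_family]

# A server refreshed from 8 to 32 cores quadruples the license count,
# even though it is still "one server" in a traditional asset register.
print(licenses_required(8, "x86"))   # 4.0
print(licenses_required(32, "x86"))  # 16.0
```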


2nd Challenge: Virtualization

The second challenge encompasses the rapid growth and use of server virtualization in the datacenter. Historically, you would install an enterprise software application on a specific server and license it to run on that server, or you would purchase a license for the number of users accessing the application on the server. Virtualization introduced the notion of “moving” applications across any number of potentially available servers or a cluster of “virtual servers”. The 1:1 ratio of installing an application on a physical server has shifted to a 1:N ratio, where “N” could be a range of virtual servers or a cluster of servers with network storage. The dynamic nature of matching total server compute capacity with enterprise application computing demands has complicated datacenter asset management and measurement requirements. How applications are installed and configured today will look different 30 days from now because of the “dynamic” nature that virtualization allows. Enterprise software vendors have recognized this and adapted their licensing terms to address it, for example with server affinity and sub-capacity licensing. So now IT needs to figure out how to monitor these new requirements as they apply to license compliance, and to track the trends and changes over time, since the environment is no longer static and changes can have an impact on licensing costs.
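As a rough illustration of why this matters, consider the difference between licensing every host a VM could migrate to and sub-capacity licensing limited to the hosts it is pinned to. This is a hypothetical sketch; real vendor rules vary:

```python
# Hypothetical 4-host cluster, 16 cores per host, one licensed VM with 4 vCPUs.
hosts = [{"name": f"esx{i}", "cores": 16} for i in range(1, 5)]

def full_capacity_cores(cluster):
    # Without affinity rules, a vendor may require licensing every core the
    # VM *could* migrate to, i.e. the entire cluster.
    return sum(h["cores"] for h in cluster)

def sub_capacity_cores(pinned_hosts):
    # With server affinity / sub-capacity terms, only the hosts the VM is
    # pinned to need to be licensed.
    return sum(h["cores"] for h in pinned_hosts)

print(full_capacity_cores(hosts))     # 64 cores to license
print(sub_capacity_cores(hosts[:1]))  # 16 cores if pinned to a single host
```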


3rd Challenge: The 80/20 Rule

The SAM tools available today have largely been designed to manage PC/desktop software deployments, and in most cases enterprises already have tools that solve this problem. Yet the majority of enterprise spend is in the server, storage and enterprise software applications installed in the datacenter from the major enterprise software and hardware vendors. This area seems to have been ignored, and it is where significant cost savings can be achieved.


CRM has evolved into a broader relationship management tool, very different from the traditional use of CRM as access to customer details through a customer database. In fact, Microsoft now refers to CRM as xRM – “anything relationship management” – with corporate processes automated through workflow management. With the evolving role of CRM and the growing use of datacenters deploying cloud technology, what are the most important factors that make up a robust software compliance strategy?

Our discussions with customers, CIOs and Vice Presidents of IT reveal that they recognize three basic things need to happen in order to modernize their software compliance strategy. First, they need to have confidence in the accuracy, completeness and visibility of their “virtual” datacenter asset data.

Second, once they have visibility of this asset data, they need to improve their system processes and IT workflows across the organization. Third, the integration of the application workflow and process tools supporting the business operation can occur.

I do not believe that it is specific to CRM or xRM, but rather a matter of agreement between IT and business operations on the separation and abstraction of the underlying elements from the process flows. For example, look at how CTI and call center technologies were designed in the 1990s, and at the impact a web technology stack has had on how “CRM” is designed and accessed today. The point is that SAM architectures need to unravel, or uncouple, proprietary “siloed” solutions into more efficient layered architectures, basically a “cloud” architecture.


As more sensitive data and business-critical processes move to cloud environments, cloud compliance and security seem to be becoming a higher priority in managing datacenters. How pervasive is the usage of SAM in datacenters today, and do you see this as a growing market for iQuate? What verticals are your sweet spot?

iQuate’s focus is on helping solve the discovery and inventory challenges within large enterprises that have multiple datacenter locations. Security is an important facet of these discussions, and making sure that data is not compromised is a constraint, especially within a cloud environment. SAM for multi-core, virtualized datacenters is a growing market and is becoming a high priority. As enterprises attempt to remove costs from their operations, we believe private cloud services are an iQuate “sweet spot” within industries like Financial Services, Insurance, Banking, Healthcare and Service Providers.


In cloud environments, one of the most pervasive and fundamental challenges for organizations in demonstrating policy compliance is proving that the physical and virtual infrastructure of the cloud can be trusted, particularly when those infrastructure components are owned and managed by external service providers. How does iQuate address this issue and the added complexity?

One of the unique strengths of iQuate’s product is our ability to auto-discover and map the physical to virtual infrastructure across Unix, Linux and Windows operating systems. Virtualization support covers VMware, Hyper-V, IBM LPARs and Solaris Zones. The iQSonar product will provide the cluster and partitioning information as well. The fidelity and accuracy of this information is very important to get right, because how database, middleware and enterprise application instances get counted, measured and licensed is based on it. If enterprises do not have accurate data, their software compliance liability could be significant.
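To illustrate what a physical-to-virtual mapping has to capture, here is a minimal sketch of the kind of record such discovery produces. The field names are assumptions for illustration, not iQuate's actual schema:

```python
# A hypothetical record shape (not iQuate's actual schema) mapping a discovered
# application instance through its guest, hypervisor and physical layers.
inventory_record = {
    "application": {"vendor": "Oracle", "product": "Database"},
    "guest": {"hostname": "crm-db-01", "os": "Linux", "vcpus": 4},
    "hypervisor": {"type": "VMware", "cluster": "prod-cluster-a"},
    "physical_host": {"name": "esx03", "sockets": 2, "cores_per_socket": 8},
}

def physical_cores(record):
    # License counting often hinges on the physical host's cores, not just
    # the guest's vCPU allocation, so the mapping must reach this layer.
    host = record["physical_host"]
    return host["sockets"] * host["cores_per_socket"]

print(physical_cores(inventory_record))  # 16
```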

iQSonar will automatically scan and collect all this information in an “agentless” mode. Once the data is collected, it is stored in an inventory database so that customers can begin to manage deployments and IT planning needs. One element of the architecture can be used to create a chargeback function within a “private cloud” environment. When installed within an environment managed by external service providers, the constraint is really focused on security policies and on protecting and securing the data that is reported back to the application software vendors.
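The chargeback idea can be illustrated with a small, hypothetical allocation of license cost back to the business units running VMs in a private cloud. The names, vCPU counts and cost figure are illustrative assumptions, not product features or real data:

```python
# Hypothetical chargeback: split a monthly license cost across business units
# in proportion to the vCPUs their VMs consume. All figures are illustrative.
vms = [
    {"name": "crm-app-01", "owner": "Sales", "vcpus": 8},
    {"name": "crm-db-01",  "owner": "Sales", "vcpus": 4},
    {"name": "hr-app-01",  "owner": "HR",    "vcpus": 4},
]
monthly_license_cost = 16000.0

total_vcpus = sum(v["vcpus"] for v in vms)
charges = {}
for v in vms:
    charges[v["owner"]] = charges.get(v["owner"], 0.0) + \
        v["vcpus"] / total_vcpus * monthly_license_cost

print(charges)  # {'Sales': 12000.0, 'HR': 4000.0}
```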

Many of the organizations we talk with want to be able to scan their virtual datacenter, collect all the virtualization information and enterprise applications, and in many cases review how they could reduce their Oracle licensing, maintenance and support costs using different hardware platforms and microprocessor types.

Customers use iQSonar to scan and inventory this information, and iQAnalytics gets populated with real operational data. This enables customers to begin reviewing the impact of different scenarios, such as the cost of doing nothing, migrating to Cisco UCS, or moving to Intel rack servers. Customers can determine how to allocate and configure VMs in order to reduce the overall number of cores that need to be licensed.
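The kind of what-if comparison described can be sketched in a few lines. The host and core counts below are illustrative assumptions, not benchmark, pricing or vendor data:

```python
# Hypothetical what-if comparison of licensed core counts across platforms.
scenarios = {
    "do nothing (current estate)":   {"hosts": 10, "cores_per_host": 16},
    "migrate to Cisco UCS blades":   {"hosts": 4,  "cores_per_host": 20},
    "migrate to Intel rack servers": {"hosts": 5,  "cores_per_host": 16},
}

for name, s in scenarios.items():
    cores = s["hosts"] * s["cores_per_host"]
    print(f"{name}: {cores} cores to license")
```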





