Virtualization Adapted: Adapting Business Processes for Virtual Infrastructure (and vice versa)


Using Cryptographic Hashes to verify file download integrity

Filed under: virtualization — iben @ 10:58

The SHA hash functions are a set of cryptographic hash functions designed by the National Security Agency (NSA) and published by the NIST as a U.S. Federal Information Processing Standard. SHA stands for Secure Hash Algorithm.

Vendors provide a SHA-1 hash for software downloads. This enables you to verify that your downloaded files are unaltered from the original.

To confirm file integrity, use a SHA-1 utility on your computer to calculate your own hash for files downloaded from the VMware web site.

If your calculated hash matches the message digest the vendor provides, you can be confident that the file was downloaded intact.

SHA-1 utilities are available for Windows, Linux, and Mac. Most UNIX installations provide a sha1sum command for SHA-1 hashes. You may need a newer Linux kernel to calculate the checksums for larger files.
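For scripted checks, the same comparison can be sketched in Python with the standard hashlib module. This is an illustrative sketch, not a vendor tool; the function names are my own:

```python
import hashlib

def sha1_of_file(path, chunk_size=65536):
    """Compute the SHA-1 digest of a file, reading in chunks so that
    large ISO downloads do not have to fit in memory."""
    digest = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def matches_published_hash(path, published):
    # Published digests vary in letter case, so compare case-insensitively.
    return sha1_of_file(path) == published.strip().lower()
```

Run `matches_published_hash("download.iso", "A548D674…")` against the digest string copied from the vendor's download page; a False result means the file should be re-downloaded.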

The File Checksum Integrity Verifier (FCIV) can be used on Windows-based products to verify SHA-1 values. Please see Microsoft's documentation for details on FCIV.

Mac OS X: How to Verify a SHA-1 Digest

Instructions on checking a SHA-1 checksum on a Mac:
1. In Finder, browse to /Applications/Utilities.
2. Double-click the Terminal icon. A Terminal window will appear.
3. In the Terminal window, type “openssl sha1 ” (openssl sha1 followed by a space).
4. Drag the downloaded file from the Finder into the Terminal window.
5. Click in the Terminal window, press the Return key, and compare the checksum displayed on the screen to the one on the vendor’s download page.

From TechNet

Windows Server 2008 R2 Standard, Enterprise, Datacenter, and Web (x64) – DVD (English)
File Name: en_windows_server_2008_r2_standard_enterprise_datacenter_web_x64_dvd_x15-50365.iso
Size: 2,858 (MB)
Date Published (UTC): 8/31/2009 10:22:24 AM
Last Updated (UTC): 1/11/2010 4:31:40 PM
SHA1: A548D6743129F2A02C907D2758773A1F6BB1BCD7
ISO/CRC: 8F94460B

About MD5

MD5 was designed by Ron Rivest in 1991 to replace an earlier hash function, MD4. In 1996, a flaw was found with the design of MD5. While it was not a clearly fatal weakness, cryptographers began recommending the use of other algorithms, such as SHA-1 (which has since been found also to be vulnerable). In 2004, more serious flaws were discovered, making further use of the algorithm for security purposes questionable; specifically, a group of researchers described how to create a pair of files that share the same MD5 checksum. Further advances were made in breaking MD5 in 2005, 2006, and 2007. In an attack on MD5 published in December 2008, a group of researchers used this technique to fake SSL certificate validity.

US-CERT says MD5 “should be considered cryptographically broken and unsuitable for further use,” and most U.S. government applications now require the SHA-2 family of hash functions.
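A quick hashlib sketch makes the family differences concrete. The message here is arbitrary, and sha256/sha512 are simply two members of the SHA-2 family US-CERT points to:

```python
import hashlib

message = b"example download"

# MD5 and SHA-1 are still available for checking legacy published
# checksums, but new integrity checks should prefer the SHA-2 family.
print("MD5:    ", hashlib.md5(message).hexdigest())      # 128-bit, collision-broken
print("SHA-1:  ", hashlib.sha1(message).hexdigest())     # 160-bit, weakened
print("SHA-256:", hashlib.sha256(message).hexdigest())   # SHA-2 family
print("SHA-512:", hashlib.sha512(message).hexdigest())   # SHA-2 family
```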


Virtualized Active Directory Domain Services

Filed under: virtualization — iben @ 00:21

We have set up virtualized Active Directory domain controllers for many customers. Windows 2003 at first, and now Windows 2008, both work fine as virtualized domain controllers.

Here are some of the links and notes that are helpful as references…


An anti-affinity DRS rule is used when you want to keep two virtual machines on separate hosts because they provide a redundant service, and locating them on the same host would eliminate that redundancy.


The Virtual Machine on 64-Bit Windows Server

If using the x64 version of Windows Server 2003 or 2003 R2, one of the primary goals will be to contain the entire Active Directory database within the virtual machine’s RAM cache. On 64-bit Windows, 16 GB of RAM cache will accommodate a database of approximately 2.5 million users.
Caching the Active Directory database in 64-bit Windows avoids performance hits related to certain disk operations. For a virtual machine that is a domain controller, add, modify, search, delete, and update operations generally benefit significantly from caching. Write operations will always incur a slight penalty, regardless of whether a domain controller is running on a physical or virtual machine.
There is limited benefit to filling the cache on 32-bit Windows for customers with large directories; in fact, in some cases this can actually exhaust kernel resources.
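Working backwards from the figures above (16 GB of cache for roughly 2.5 million users) gives a rough per-user footprint you can use for sizing. The extrapolation is mine, not a Microsoft guideline:

```python
GIB = 1024 ** 3

# From the text: 16 GiB of RAM cache holds ~2.5 million users.
cache_bytes = 16 * GIB
users = 2_500_000
bytes_per_user = cache_bytes / users  # roughly 6.9 KB per user object

def cache_needed_gib(user_count):
    """Estimate the RAM cache (in GiB) needed to hold the whole Active
    Directory database for a directory of user_count users.
    Assumes the per-user footprint extrapolated above."""
    return user_count * bytes_per_user / GIB

print(round(bytes_per_user))                # ~6872 bytes per user
print(round(cache_needed_gib(500_000), 1))  # ~3.2 GiB for a 500k-user directory
```

So a mid-sized directory fits comfortably in a much smaller virtual machine; size the VM's RAM to the database, not to the 16 GB ceiling quoted above.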

First Published: 17 June 2009
Windows Server 2008 and Windows Server 2008 R2 further refine the functionality, with the service being renamed Active Directory Domain Services.

This article describes a condition that occurs when a domain controller that is running Microsoft Windows 2000 or Microsoft Windows Server 2003 starts from an Active Directory database that has been incorrectly restored or copied into place. This condition is known as an update sequence number rollback, or USN rollback. When a USN rollback occurs, modifications to objects and attributes that occur on one domain controller do not replicate to other domain controllers in the forest. Because replication partners believe that they have an up-to-date copy of the Active Directory database, monitoring and troubleshooting tools such as Repadmin.exe do not report any replication errors.

Here is a link to a VMworld 2006 Presentation titled TAC 9710 –
Virtualizing a Windows Active Directory Domain Infrastructure:
* Clock synchronization
* Network performance
* Multi-master replication model
* Security
* Potential single point of failure
* Disaster recovery


To help prevent a potential update sequence number (USN) rollback situation, see Appendix A: Virtualized Domain Controllers and Replication Issues.


— I b e n
iben.rodriguez – gmail


Microsoft SQL Server Consolidation Worksheet – Information

Filed under: virtualization — iben @ 12:33

Complete a worksheet like the one provided by Microsoft to get a better idea of any consolidation opportunities. Microsoft has provided a SQL Server consolidation worksheet to assist with the process of consolidating SQL Servers. The worksheet gathers information such as:
* Server product and version
* Perfmon – Read / Write ratios (2 Weeks of data minimum at 5 Minute averages)
* Perfmon – Memory utilization (2 Weeks of data minimum at 5 Minute averages)
* Perfmon – Disk I/O (2 Weeks of data minimum at 5 Minute averages)
* Perfmon – Network I/O (2 Weeks of data minimum at 5 Minute averages)
* Security context of databases
* High Availability requirements / Clustering
* Limitations that prevent clustering
* Stability of servers
* Analysis add-ons
* Custom Stored Procedures
* OLTP and OLAP features and frequency of use
* Dependencies of Server / Instance names
* Life expectancy of each database / application / dependency
* Do the apps support Instance Names
* Do the apps have hard coded Server Names / IP’s
* Business continuance requirements
* How many databases
* Data growth rate
* Data Retention Policies
* Backup windows
* Backup technologies
* Change Management for Upgrades / Patching
* SAN technologies
* Peak usage / Low usage time windows of each server
* Location
* Replication frequency, duration, and volume
* SQL mail and other tool interaction
* Indexing / Natural Language Query
* Connectivity requirements
* Processor or Seat licensing
* Internet / Public Access vs. Internal only
* SLA’s to business units
* Who owns the servers (Business Units / Customers / IT Services, etc)
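Once the Perfmon counters above are exported to CSV, summarizing two weeks of 5-minute samples is straightforward to script. This is a sketch; the column names ("Disk Reads/sec", "Disk Writes/sec") are assumptions to adapt to whichever counters you actually logged:

```python
import csv
from statistics import mean

def read_write_ratio(csv_path):
    """Average read/write ratio across all 5-minute Perfmon samples in a
    CSV export. Column names are assumed and must match your counter log."""
    reads, writes = [], []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            reads.append(float(row["Disk Reads/sec"]))
            writes.append(float(row["Disk Writes/sec"]))
    return mean(reads) / mean(writes)
```

Run it against each candidate server's export; read-heavy instances (high ratio) are usually easier consolidation targets than write-heavy ones.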

A typical database assessment can last 4 to 6 weeks if the resource has access to the servers and the answers to the questions above. Regardless of the number of servers, the assessment process is the same. You may be able to ballpark the number of servers early, but the actual count can only be determined by detailed analysis and thorough testing.


Powered by WordPress