TechOnTip Weblog

Run book for Technocrats

Archive for July, 2012

System Administrator Appreciation Day

Posted by Brajesh Panda on July 27, 2012

My Love & Respect….

July 27, 2012: 13th Annual System Administrator Appreciation Day

A sysadmin unpacked the server for this website from its box, installed an operating system, patched it for security, made sure the power and air conditioning was working in the server room, monitored it for stability, set up the software, and kept backups in case anything went wrong. All to serve this webpage.

A sysadmin installed the routers, laid the cables, configured the networks, set up the firewalls, and watched and guided the traffic for each hop of the network that runs over copper, fiber optic glass, and even the air itself to bring the Internet to your computer. All to make sure the webpage found its way from the server to your computer.

A sysadmin makes sure your network connection is safe, secure, open, and working. A sysadmin makes sure your computer is working in a healthy way on a healthy network. A sysadmin takes backups to guard against disaster both human and otherwise, holds the gates against security threats and crackers, and keeps the printers going no matter how many copies of the tax code someone from Accounting prints out. A sysadmin worries about spam, viruses, spyware, but also power outages, fires and floods. When the email server goes down at 2 AM on a Sunday, your sysadmin is paged, wakes up, and goes to work.

A sysadmin is a professional, who plans, worries, hacks, fixes, pushes, advocates, protects and creates good computer networks, to get you your data, to help you do work — to bring the potential of computing ever closer to reality.

So if you can read this, thank your sysadmin — and know he or she is only one of dozens or possibly hundreds whose work brings you the email from your aunt on the West Coast, the instant message from your son at college, the free phone call from the friend in Australia, and this webpage.

Show your appreciation

Friday, July 27, 2012, is the 13th annual System Administrator Appreciation Day. On this special international day, give your System Administrator something that shows that you truly appreciate their hard work and dedication. (All day Friday, 24 hours, your own local time-zone).

Let’s face it, System Administrators get no respect 364 days a year. This is the day that all fellow System Administrators across the globe will be showered with expensive sports cars and large piles of cash in appreciation of their diligent work. But seriously, we are asking for a nice token gift and some public acknowledgement. It’s the least you could do.

Consider all the daunting tasks and long hours (weekends too). Let’s be honest, sometimes we don’t know our System Administrators as well as they know us. Remember, this is one day to recognize your System Administrator for their workplace contributions and to promote professional excellence. Thank them for all the things they do for you and your business.


Posted in Mix & Match

IIS 7.5 FTP Administration Automation Powershell Script (APPCMD)

Posted by Brajesh Panda on July 26, 2012

Here is my FTP user provisioning script for IIS 7.5 FTP server. Later I will publish configuration steps for FTP & FTPS Server.

Most of the command strings below are not native PowerShell, but I use them inside a PowerShell script because it makes passing variables easy. APPCMD.exe is the command-line tool for IIS 7 and above; you will find it in C:\Windows\System32\inetsrv. To make it work in your script, make sure it is on your environment PATH.
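For example, instead of editing the machine-wide environment variables, the script can prepend the inetsrv folder to the PATH for the current session only (a small sketch; the appcmd call just sanity-checks that the tool now resolves):

```powershell
# Make appcmd.exe resolvable in this PowerShell session only
$env:Path += ';C:\Windows\System32\inetsrv'

# Sanity check: list the IIS sites to confirm appcmd is found
appcmd list site
```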

Earlier it used to take five minutes to set up a user account manually, and I always forgot to do something, which was frustrating. Now it takes seconds and is completely hands-off. Isn’t it awesome 😉

# Capture FTP user details
$UserLogonName = Read-Host "Enter Logon Name"
$UserPassword = Read-Host "Enter Password"
$UserFullName = Read-Host "Enter Full Name"
$UserDescription = Read-Host "Who uses this account?"

# Create the local FTP user; account never expires, user can't change password
net user $UserLogonName $UserPassword /add /passwordchg:no /expires:never /active:yes /fullname:"$UserFullName" /comment:"$UserDescription"

# Set the FTP user account's password not to expire, using WMI in PowerShell
$WMI = gwmi Win32_UserAccount -Filter "Name='$UserLogonName' and LocalAccount=True"
$WMI.PasswordExpires = $False
$WMI.Put()

# Add the FTP user to the FTPUsers group; this group has permission to connect to this FTP server
net localgroup FTPUsers $UserLogonName /add

# Create the FTP directory for the above FTP user
mkdir C:\WEBRoot\$UserLogonName

# Create the FTP virtual directory
appcmd add vdir /app.name:"FTP_Server/" /path:/$UserLogonName /physicalPath:C:\WEBRoot\$UserLogonName

# Remove the FTPUsers group from the virtual directory, so nobody else will be able to access this folder
appcmd set config "FTP_Server/$UserLogonName" -section:system.ftpServer/security/authorization /-"[roles='FTPUsers']" /commit:apphost

# Add the above FTP user to the virtual directory authorization list with read & write permission
appcmd set config "FTP_Server/$UserLogonName" -section:system.ftpServer/security/authorization /+"[accessType='Allow',users='$UserLogonName',permissions='Read, Write']" /commit:apphost

# Automatically open the FTP server using Windows Explorer
explorer.exe ftp://<URL>
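After provisioning, the result can be double-checked with appcmd as well; a quick verification sketch (assuming the same "FTP_Server" site name used above):

```powershell
# List the virtual directory that was just created under the FTP site
appcmd list vdir /app.name:"FTP_Server/"

# Show the authorization rules that now apply to the user's folder
appcmd list config "FTP_Server/$UserLogonName" -section:system.ftpServer/security/authorization
```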

Posted in IIS

Stale Computer Accounts

Posted by Brajesh Panda on July 23, 2012

The script below will help determine whether a computer account is stale. I am using the “Quest ActiveRoles Management Shell for Active Directory” snap-in.

I am importing computer names from computers.txt, using the Quest PowerShell cmdlet to select the required attributes, then calculating password age and sorting by it. Lastly, I export to a CSV file.

Get-Content .\computers.txt | ForEach {Get-QADComputer -IncludeAllProperties $_ | select name, pwdlastset, lastLogonTimestamp, @{n="PWAge";e={((get-date) - $_.pwdLastSet).days}}, whencreated, WhenChanged, operatingSystem, operatingSystemVersion, dNSHostName, distinguishedName, Guid, Sid} | sort PWAge -desc | Export-Csv cmp1.csv -NoTypeInformation -Force

Search Active Directory for stale computer accounts that have not reported in for the last 90 days:

$StaleAge = (Get-Date).AddDays(-90)
Get-QADComputer -IncludeAllProperties -SizeLimit 0 | where {$_.pwdLastSet -le $StaleAge -and $_.operatingSystem -notlike "*Server*"} | select name, AccountIsDisabled, pwdlastset, lastLogonTimeStamp, @{n="PasswordAge";e={((get-date) - $_.pwdLastSet).days}}, whencreated, WhenChanged, operatingSystem, operatingSystemVersion, dNSHostName, ParentContainer | sort ParentContainer -desc | Export-Csv -NoTypeInformation c:\StaleComputerAccounts.csv

“-notlike” filters out server operating system computer accounts.

“AccountIsDisabled” shows which computer accounts are already disabled.
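If you don’t have the Quest snap-in, a similar query can be sketched with Microsoft’s own ActiveDirectory module (assumes the RSAT AD PowerShell feature is installed; attribute names follow the standard Get-ADComputer cmdlet):

```powershell
Import-Module ActiveDirectory

$StaleAge = (Get-Date).AddDays(-90)
Get-ADComputer -Filter 'PasswordLastSet -le $StaleAge -and OperatingSystem -notlike "*Server*"' `
    -Properties PasswordLastSet, lastLogonTimestamp, OperatingSystem, whenCreated |
    Select-Object Name, Enabled, PasswordLastSet, OperatingSystem, whenCreated,
        @{n='PasswordAge'; e={((Get-Date) - $_.PasswordLastSet).Days}} |
    Sort-Object PasswordAge -Descending |
    Export-Csv C:\StaleComputerAccounts.csv -NoTypeInformation
```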

If you have a large number of computer accounts, this is going to take a while, so you may want to send the output using a mail script. Make sure your mail server can relay mail from the machine where you are running/scheduling the script.

Send-MailMessage -From "" -To "" -Subject "Stale Computer Accounts (Workstations)" -Body "Find attached the list of stale computer accounts. These computers have not reported to the Contoso domain for 90 days or more. Please make sure your allocated Organizational Units & the default Computers container are cleaned up." -Attachments "c:\StaleComputerAccounts.csv" -SmtpServer

Here is a nice article

Posted in Active Directory User Management

Windows 2012: Cut, Copy, Delete “Pause & Resume” Feature

Posted by Brajesh Panda on July 18, 2012

While using Windows Server 2012 & Windows 8, I observed that I can pause & resume cut, copy, or delete operations. Isn’t it nice?

Another beauty: this feature lists all running tasks in a single window rather than multiple windows! So no hassle toggling between windows 😉

I appreciate this long-pending feature.

Posted in Windows 2012

Convert WIM to VHD or VHDX

Posted by Brajesh Panda on July 17, 2012

I hope you have heard about WIM files. Since the release of Vista, these files have been very common in the OS installation process. WIM stands for Windows Imaging Format; it is a file-based disk image format. If you browse the Vista, Win 7, Win 2k8 (R2), or Windows 2012 install media, you will find a boot.wim & an install.wim file. There are a bunch of good articles about the benefits of the WIM format, such as a single-instance image serving multiple purposes, hardware independence, etc. There are two tools which can be very useful while working with WIM images:

ImageX: To Create, Edit & Deploy WIM Images.

DISM: Deployment Image Servicing and Management. Used to modify an image’s contents or list what it contains.

So when we install Vista or a later operating system, we don’t do a file-based installation like XP or older operating systems, where each file of the OS gets copied to the hard disk. Here the OS installer just applies the WIM file to the disk. You may remember Ghost images; it is somewhat the same: apply the image to the target disk 😉 So technically we can apply the image file to a USB hard drive or thumb drive & make it bootable. The only requirement is that your hardware must support booting from that device.

Nowadays we are creating a lot of virtual machines. We create golden virtual machine images & keep them in our library (VHD/VMDK), and then we clone/deploy new virtual machines out of those images. But if we want to install a brand new, bare Windows virtual machine, we don’t need to go all the way through the installation process. We can convert the WIM file to a VHD file; it will create an out-of-the-factory sysprepped VHD.

Let’s convert a WIM file to a VHD:

You can download the Convert-WindowsImage.ps1 script to convert a WIM file to a VHD/VHDX file. While you are downloading, make sure you go through the release notes, which describe each PowerShell parameter.

There used to be an older script called WIM2VHD.wsf, but it has been rewritten in PowerShell with VHDX support and much simpler syntax.

If you don’t want to remember all the PowerShell parameters, there is a GUI option:

  • Open PowerShell, change directory to the folder where you downloaded the Convert-WindowsImage.ps1 script, and run the command below
  • .\Convert-WindowsImage.ps1 -ShowUI
  • It will open the UI; populate/select the necessary information & options & click "Make my VHD"
  • While it converts to VHD, you can see the process & progress in your PowerShell window. It took 3 minutes to create my Windows 2012 VHDX file.
  • Then I used "New-VM -Name Test2 -Path E:\VMStore\ -MemoryStartupBytes 512MB -SwitchName VM_Network -VHDPath e:\VMStore\test2\Win2k12DC.vhdx" to create my Windows 2012 VM
  • There is an "Unattended" option where you can supply an unattend file so it will do the necessary configuration in the target VM. We can club the conversion & VM creation into a single script. Bottom line: we can do nice automation. Isn’t it awesome?? 😉
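For the fully scripted (non-UI) path, the two steps can be clubbed together roughly like this (parameter names are as documented in the script’s release notes; paths, edition, and sizes are just examples):

```powershell
# Convert install.wim into a bootable, sysprepped VHDX
.\Convert-WindowsImage.ps1 -SourcePath D:\sources\install.wim `
    -Edition "ServerDataCenter" -VHDFormat VHDX `
    -VHDPath E:\VMStore\Test2\Win2k12DC.vhdx -SizeBytes 40GB

# Create the VM straight from the fresh VHDX (Hyper-V module)
New-VM -Name Test2 -Path E:\VMStore -MemoryStartupBytes 512MB `
    -SwitchName VM_Network -VHDPath E:\VMStore\Test2\Win2k12DC.vhdx
```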

Posted in MsHyper-V

Windows 2012: Deduplication

Posted by Brajesh Panda on July 16, 2012

  • Here are my deduplication testing results. There is a nice TechNet blog article where you can find all the theoretical details.
  • Dedup can be managed from Server Manager or PowerShell.
  • There is a free tool, "ddpeval.exe", that comes with Windows 2012 to evaluate how much space we would save if we enabled dedup. You can copy it to another computer (2008 or 7 family) and run it there.
    • I am using the 2012 release candidate, build 8400
    • Supports only the NTFS file system (no ReFS support yet)
    • For my testing I copied two install.wim files of 2.9 GB each to the F: drive

  • Enable dedup by right-clicking the volume & selecting Configure Data Deduplication, or run "Enable-DedupVolume F:"
  • Then attach a schedule for when you would like to run the dedup process on the file system
  • If you want to start immediately, run "Start-DedupJob F: -Type Optimization"

  • After the job completes, Server Manager shows how much space we saved. We can also use "Get-DedupStatus"
  • Wow, we are saving 49%. Isn’t it awesome 😉
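End to end, the PowerShell-only path is a handful of cmdlets (a sketch; cmdlet names are from the Windows Server 2012 deduplication module):

```powershell
# One-time: install the Data Deduplication role service
Install-WindowsFeature FS-Data-Deduplication

# Enable dedup on the volume and start an optimization job immediately
Enable-DedupVolume F:
Start-DedupJob -Volume F: -Type Optimization

# Watch the job, then check the savings
Get-DedupJob
Get-DedupStatus F: | Format-List SavedSpace, SavingsRate
```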

Posted in Windows 2012

Fundamentals of Data Backup

Posted by Brajesh Panda on July 15, 2012

  1. Types of Backup
    1. Full
  • A complete backup of the given data (file/folder/application data/database, etc.)
  • Takes a long time to complete, in proportion to the data size
  • Advantage: you only need a single backup set to restore the data
    2. Incremental
  • A backup of the data changed since the last backup
  • Advantage: as the amount of changed data is small, the time required to complete an incremental backup is short
  • Disadvantage: at restore time, you need the full backup media + all incremental backup media since then; you need a lot of tapes
    3. Differential
  • A backup of the data changed since the last full backup
  • Advantage: at restore time you need only two media sets, i.e., the last full backup + the last differential backup
  • Disadvantage: a differential takes more time because it backs up all changes since the last full backup
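The restore-media trade-off above can be illustrated with a tiny sketch (the function name and logic are illustrative only, not any vendor’s API; mixed incremental + differential chains are ignored for simplicity):

```powershell
# Count the media sets needed to restore the latest state from a backup history
function Get-RestoreMediaCount {
    param([string[]]$History)   # entries: 'Full', 'Incremental', 'Differential'
    $lastFull = [Array]::LastIndexOf($History, 'Full')
    $lastDiff = [Array]::LastIndexOf($History, 'Differential')
    if ($lastDiff -gt $lastFull) {
        return 2                                    # full + latest differential
    }
    return 1 + ($History.Count - 1 - $lastFull)     # full + every incremental since
}

Get-RestoreMediaCount @('Full','Incremental','Incremental','Incremental')  # 4 media sets
Get-RestoreMediaCount @('Full','Differential','Differential')              # 2 media sets
```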
  2. Methods of Backup
    1. Online
  • The target data is live; users are using the data.
    • e.g., a file server: people are already using it & your backup software is taking the backup
    • e.g., an Oracle server: the database is online/live while your backup software is taking the backup
  • During an online backup, as the server or services are live, there is a chance of data changing; the data may be inconsistent, i.e., dirty. So backup agents have special application-awareness features (like VSS, etc.) to take a consistent backup of the data, so that consistent, good data can be restored later.
  • Due to these application-awareness features, the backup tools/agents are expensive.
  • With this method you don’t need any downtime for the application, because the agent can take the backup while the production server is online & users are working.
    2. Offline
  • The target data is offline; no user is using the data.
    • e.g., an Oracle server: database services are down, there are no user connections, & you are just backing up the database & log files from the directory.
  • No need for application-awareness features as in online backup, because the data is already offline & clean; no data changes happen during the backup job.
  • Because of this, these tools are cheaper.
  • With this method you need downtime for the application, which is not possible in every environment.


  3. Types of Backup Media
    1. Portable
  • Media which can be easily moved (ported) from one place to another
  • Floppy, CD, DVD, tape, USB hard disk
    2. Non-Portable
  • Media which can’t be moved from one place to another
  • Backup server hard disks, connected storage


  4. Deciding Between Tape and Disk
    1. Tape
  • Tapes are magnetic media, so reads & writes are sequential
  • Tape drive speed (RPM, rotations per minute) is lower compared to hard disks
  • Each tape has a finite life; after a certain number of write cycles, tapes get damaged
  • Backup jobs which use tapes are called Disk-to-Tape (D2T) backup jobs
  • You need another device to write to tapes, which brings in additional expense:
    • Stand-alone tape drive
    • Tape autoloader
    • Tape library
  • Tape is the oldest method of taking backups & retaining them for many years
  • Tape size/capacity is limited per generation:
    • DLT
    • LTO1 – 100GB
    • LTO2 – 200GB
    • LTO3 – 400GB
    • LTO4 – 800GB
    • LTO5 – 1500GB
  • (DLT – Digital Linear Tape)
  • (LTO – Linear Tape Open; the number stands for the generation, e.g., LTO1 is Generation 1)
  • The capacity values above are raw tape capacities; if we compress data at backup time, capacity can increase by 50% or more, depending on the backup software & what we are backing up
  • Data can be compressed while writing, to fit more data onto the tape
  • For security, data can be encrypted
    2. Disk
  • Disks are based on an electric motor & data platters and provide random reads & writes; the random-access nature of disks provides more data throughput
  • Disk RPM is higher than tape, helping deliver more data throughput
  • Data on disks can be stored for an indefinite period
  • Backup jobs that use disk as the media target are called Disk-to-Disk (D2D) backup jobs


  5. Key Words (Jargon)
    1. Backup Server
  • The server where the specialized backup server software runs & communicates with the backup agents. It manages backup devices & helps take backups.
  • There are a lot of companies that provide backup server software:
    • Symantec NetBackup
    • Symantec Backup Exec
    • CommVault
    • CA ARCserve
    • EMC NetWorker
    • EMC Avamar
    • EMC Data Domain
    • Windows Backup tool
    2. Backup Agents
  • Each backup server comes with different types of agents.
  • These agents help the backup server communicate with the source server; they ship the selected data to the backup server to get it backed up.
  • Agents come with application-awareness features, so consistent data can be backed up.
    3. Backup Job
  • A backup job is a task on the backup server where you define what, when & where.
    4. Backup Policy
  • If there are 1000 servers, with one backup job per server you would have to create 1000 backup jobs & manage them day to day, which may not be possible.
  • So we create backup policies with the When & Where these backups are going to run.
  • And we connect the backup policy to the target agents (the What), so the backup jobs get created automatically.
  • Any time we want to change the When & Where, we just change one policy; the respective backup jobs get changed automatically.
  • You can also configure duplicate backup jobs in the same policy without any complex or repetitive backup job configuration.
    5. Selection List
  • In every backup job we have to select the source server & the respective files on that server; this defines what you are going to back up.
  • A selection list helps this work with a backup policy.
  • You create a selection list & connect it to a specific backup policy.
  • It is easy to select, unselect, or remove any old selection from that list.
    6. Backup Throughput
  • The data transfer rate from the agent to the backup server’s media device
  • Mostly measured in MB/min (megabytes per minute)
  • It depends on multiple factors: source server utilization at the time of backup, source server network card speed & configuration, intermediate network switches & stack, backup server network card configuration, available network bandwidth for the backup server, how many backup jobs are running at a time, etc.
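As a quick sanity check of the unit, throughput is just data moved divided by elapsed time; for example (the numbers are made up):

```powershell
# 120 GB backed up in 5 hours, expressed in MB/min
$sizeMB  = 120 * 1024   # 122880 MB
$minutes = 5 * 60       # 300 minutes
$throughputMBmin = $sizeMB / $minutes
"{0:N1} MB/min" -f $throughputMBmin   # 409.6 MB/min
```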
    7. Media Set
  • One medium, or a group of media, to which we have taken a backup: tapes, disks, etc.
    8. Media Retention
  • On a media set we can define retention, i.e., how long we would like to retain those backups.
  • This value depends on the life period of the given data.
  • e.g., weekly, monthly, yearly, and so on.
  • After that period, the media can be overwritten or recycled.
    9. Media Server
  • The server which keeps or manages all media sets per their retention periods &, at backup-job time, presents those media to the backup server to use for the backup.
  • Media servers are mostly connected to tape drives, autoloaders, tape libraries, and external disk storage systems.


  6. Advanced Technologies
    1. Duplicate Backup Set
  • If you want to copy the same backup to multiple media, you need to run duplicate backup set jobs.
  • For example, if you want to copy the same backup set from backup disk to tape, or tape to tape.
    2. De-Duplication of Backup
  • While backing up, most of the time we back up the same data again & again, wasting a lot of space.
  • The de-duplication feature removes those duplicated files & keeps just one copy.
  • De-duplication is mostly implemented in D2D environments, where jobs use a disk as the media target.
  • De-duplication comes in two types:
    • Source-Side De-duplication
      • The backup agent uses this feature to detect duplicates at the production server end, using production server resources such as CPU, memory, disk, etc. After duplicate detection, the agent sends only a single copy of the data to the backup server to get backed up. It reduces network bandwidth, but the de-duplication process consumes processing power at the source, possibly impacting production services.
    • Target-Side De-duplication
      • The backup server uses this feature to detect duplicates while writing to the target media set, or after the backup completes. All data, including duplicate files, gets transferred to the backup server; the backup server then runs the de-duplication process on it and retains only one copy of the data. It consumes network bandwidth. This is the most commonly implemented solution these days.
  • How duplicate files are detected depends on the backup software & vendor.
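Duplicate detection is typically content-hash based: files whose hashes match are duplicates. A minimal sketch using the built-in Get-FileHash cmdlet (PowerShell 4 and later); the C:\Backup path is just an example:

```powershell
# Group files by SHA-256 hash and report reclaimable bytes per duplicate group
$dupeGroups = Get-ChildItem C:\Backup -Recurse -File |
    Get-FileHash -Algorithm SHA256 |
    Group-Object Hash |
    Where-Object { $_.Count -gt 1 }

foreach ($group in $dupeGroups) {
    $size = (Get-Item $group.Group[0].Path).Length
    "{0} copies, {1:N0} bytes each; {2:N0} bytes reclaimable" -f `
        $group.Count, $size, ($size * ($group.Count - 1))
}
```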

Posted in BackupExec 2010

Google Compute Engine ( Is it challenging to Amazon EC2?? )

Posted by Brajesh Panda on July 2, 2012

As per Om Malik’s GigaOM article:

The focus of the Google Compute engine is on performance, scale and value. In order to show its performance and scale, Google is planning to show off a genomic app that runs on 600,000 cores. Another app will use 10,000 virtual machines. And if that isn’t enough, the company says it will offer 50 percent more compute resources compared to other shared cloud infrastructures. Translation: It’s a shot across the bow of Amazon Web Services’ EC2 offering.

If this is true, I would say it is grid computing, not just cloud 😉 (whatever the marketing word is). You may like to take a look at my very first article about cloud, grid, SaaS, PaaS, and IaaS.

As per my understanding, Google Compute Engine will provide on-demand computing services at the infrastructure level (Infrastructure as a Service) from Google’s worldwide data centers. Industry pundits are saying the pricing is going to give EC2 a hard time.

Here are some articles for your reading;

Product Page:

Product Pricing:

Google Blog:

Om Malik’s Article:

Posted in CloudComputing
