Unix script for finding unexported LUNs on 3PAR

Here is a script I threw together to quickly find all virtual volumes (VVs) that are either exported to a host group with no active connections, or not exported to any host group at all.
The script assumes you have SSH keys set up so you can log in without typing a password.

Also be aware that this will grab ALL VVs, including any system VVs and file persona VVs (see images below).

#!/bin/sh
##Pass an array name to the command
##unexported.sh logs into the array and runs showvlun -a, showvlun -t, and showvv
##then it will compare the files and show the volumes that are exported but have no paths, and the volumes that are not exported
##ex: ./unexported.sh <3parname or ip>
now=$(date +"%m.%d.%Y.%H%M%S")
arrayname=$1
showvluna="$arrayname.showvluna.$now"
showvlunt="$arrayname.showvlunt.$now"
showvv="$arrayname.showvv.$now"
offline="$arrayname.offline.$now"
unexported="$arrayname.unexported.$now"
printf "\nSHOWVLUN -a\n"
##VVs with active (logged-in) host paths
ssh -T "$arrayname" "showvlun -a" | awk '{print $2}' | grep -Ev 'VVName|total|VVname|^$' | sort -u | tee "$showvluna"
printf "\nSHOWVLUN -t\n"
##VVs with a configured (template) export
ssh -T "$arrayname" "showvlun -t" | awk '{print $2}' | grep -Ev 'VVName|total|VVname|^$' | sort -u | tee "$showvlunt"
printf "\nSHOWVV\n"
##every VV on the array
ssh -T "$arrayname" "showvv" | awk '{print $2}' | grep -Ev 'VVName|total|VVname|Name|MB|virtual|^$' | tee "$showvv"
printf "\nLUN MAPPED, HOST OFFLINE\n"
##exported VVs with no active paths (in -t but not in -a)
grep -Fxvf "$showvluna" "$showvlunt" | tee "$offline"
printf "\nUNEXPORTED VOLUMES\n"
##VVs not exported to any host group (in showvv but not in -t)
grep -Fxvf "$showvlunt" "$showvv" | tee "$unexported"
rm "$showvlunt" "$showvluna" "$showvv"

If you don’t want the script to echo the output as it runs, replace each "| tee <file>" with "> <file>"
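The comparison logic hinges on grep -Fxvf, which treats one file as a list of fixed, whole-line patterns and prints the lines of the other file that match none of them; in effect, a set difference of the two name lists. Here is a minimal standalone sketch of the same two comparisons the script performs (the VV names below are made up for illustration):

```shell
# Hypothetical VV name lists standing in for the script's temp files:
# "template" = VVs with a configured export (showvlun -t)
# "active"   = VVs with logged-in host paths (showvlun -a)
# "all"      = every VV on the array (showvv)
printf 'vv_db01\nvv_db02\nvv_web01\n' > template.txt
printf 'vv_db01\nvv_web01\n' > active.txt
printf 'admin\nvv_db01\nvv_db02\nvv_orphan\nvv_web01\n' > all.txt

# -F fixed strings, -x whole-line match, -v invert, -f read patterns from file:
# print lines of the second file that do NOT appear in the first
echo "LUN MAPPED, HOST OFFLINE:"
grep -Fxvf active.txt template.txt      # prints vv_db02
echo "UNEXPORTED VOLUMES:"
grep -Fxvf template.txt all.txt         # prints admin and vv_orphan
rm template.txt active.txt all.txt
```

Note that grep -Fxvf does not require sorted input, so the sort -u in the script is only there to de-duplicate VVs exported through multiple paths.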

 

Here is a redacted sample (note system vv “admin”):

An example containing file persona VVs:

PowerShell: Learning to use the pipeline

This chapter goes over using the pipeline to string PowerShell commands together.

The important commands covered are the following:

Export-Csv
Export-Clixml
Out-File
Out-Printer
Out-GridView
Compare-Object
ConvertTo-HTML

This chapter covered some pretty useful material.  It showed how to take the output from something like Get-Process or Get-Service and export it to an easy-to-read file.

Here are some examples of the commands in action:

Export-Csv

Export-Csv: I included the Import-Csv command here so you could see the results, and piped the output to more

 

Export-CliXml

Here’s the same example with Export-CliXml

Get-Service | Out-GridView

Out-GridView: I liked this command a lot; it requires that you have PowerShell ISE and .NET Framework 3.5.1 installed.  It takes the output from a PowerShell command and sends it to an interactive table in a separate window.

Compare-Object

Compare-Object (or Diff for short) may be one of the most useful commands in this chapter.  It allows you to take the output from something like Get-Service, write it to a file (as with Export-Csv), and compare the differences on another box!
I ran Get-Process on my Windows 2012 server and then compared it with the Get-Process (ps) output on a Windows 2008 server
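The same export-and-compare workflow can be sketched in plain shell using diff; the file names and service lists below are hypothetical stand-ins for the exported output from each box:

```shell
# Pretend these are service lists exported on two different machines
# (stand-ins for Get-Service | Export-Csv output from each server)
printf 'Dhcp\nDnscache\nSpooler\n' > server2012_services.txt
printf 'Dhcp\nDnscache\nW32Time\n' > server2008_services.txt

# diff marks lines unique to each file with < and >,
# much like Compare-Object's <= / => SideIndicator column
diff server2012_services.txt server2008_services.txt || true
rm server2012_services.txt server2008_services.txt
```

The "|| true" is there because diff exits nonzero when the files differ, which is exactly the interesting case here.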

PowerShell: My introduction to Learning PowerShell

I’ve been reading Learn Windows PowerShell 3 in a Month of Lunches.  I’m going to do a summary of the chapters I’ve already read, and a more detailed entry on the chapter I’m currently reading.

This book is designed to be read in short increments; this is not something you try to power through as fast as possible.  The authors’ idea is that you should be able to read a chapter and do its labs in 30-45 minutes, and over a month of lunches you should have a decent grasp of how to use PowerShell.

Chapters 1 and 2 are introductory material to help you get acquainted with basic terminology and get the shell set up the way you like it.  The most useful things in these chapters are how to set up PowerShell to your preference, and how to determine your version of PowerShell by running $PSVersionTable

Chapter 3 goes over how to use the help file.  PowerShell has a built-in help file that should be updated as soon as possible, using the Update-Help command, in order to get the full experience from PowerShell.  You can query the help file in various ways, but the most familiar is simply typing help followed by the command name.  If you don’t know the command but know it contains a certain word, then help with the word surrounded by asterisks “*” will search the help file for any commands that contain that word and display the list (ex. help *service*).

Chapter 4 is the chapter I’m doing today; it goes over running commands and their basic structure.  Microsoft designed PowerShell with some good ideas in mind: the command structure makes it pretty easy to find a command, even if you don’t know exactly what you want to do, without ever having to leave the shell (I know how much you like to use Google).  Commands are structured to start with a standard verb, and I’m quoting this from the book, “like Get or Set or New or Pause”, so if you went over chapter 3 you now have a decent idea of how to search for a command.

Some of the commands can get pretty cumbersome, so PowerShell has a lot of aliases built in (you can even set your own using New-Alias).  If you want to see whether a command you run often has an alias, try the Get-Alias command (ex. get-alias -definition “get-service” or get-alias -definition *service*).  Aliases function exactly the same as the fully typed-out command; they’re just a lot shorter.

An interesting feature built into PowerShell v3 is the Show-Command cmdlet.  This cmdlet opens a graphical prompt showing the command’s parameters and allows you to fill them out.

show-command get-eventlog

Show-Command

This was cool enough I took a screenshot

Syntax comes out right every time!

Show-Command only works with single commands so if you’re stringing them together, it can only help with one at a time.

The error handling in PowerShell is pretty useful, too.  Almost every error that you encounter has a good explanation of why it is yelling at you and what it thinks you may have done.  Usually the best idea is to refer back to the help file to see what you did wrong.

Useful commands, run as written:
Search for a parameter alias: (get-command get-eventlog | select -ExpandProperty parameters).computername.aliases
Test-Connection: sends ICMP echo request packets (“pings”) to one or more computers.

HCS 8 Error: KAIC15359-E


Ran into another interesting little issue or nuance last week with Hitachi Command Suite’s (HCS) Data Retention Utility (DRU).

“One or more volumes (00:AF:00, 00:AF:01, 00:AF:02, 00:AF:03, 00:AF:04)
cannot be unallocated. The last path of one or more volumes (00:AF:00,
00:AF:01, 00:AF:02, 00:AF:03, 00:AF:04) that are specified in copy pairs
cannot be unallocated. Release the copy pairs and retry. (KAIC18149-E)”

This error cropped up after I received a request to decommission some servers.  Usually all this consists of is deleting the host groups and reclaiming the storage from the array, but for some reason these particular servers were giving me fits.

Anyone who has dealt with local or remote replication on a Hitachi VSP will be familiar with the error above, and the fix is to find and split/delete the pairs.  So I split the pairs and was able to successfully delete the host groups, but wait, what’s this?  I can’t delete the volumes now?

“Error: These volumes cannot be specified: 00:AF:00, 00:AF:03, 00:AF:01,
00:AF:02, 00:AF:04. Volumes for which replication control is set cannot be
deleted. Use different volumes. (KAIC15359-E)”

But I just split them and deleted the pairs?

I logged into Storage Navigator for this particular array to see if I could get some more information, and got this error instead:

“The selected LDEV has an access attribute setting.  Check the setting(s). (Error code: 03022-105143)”

Okay, that’s not very helpful, actually…
So I started digging through all the old applications within Storage Navigator to try and track down this “access attribute”.  Based on past experience, my first thought was to look in the “Data Retention” utility (DRU).

Slight tangent: if you manage to have a DP/DT pool hit 100% used, then any volume that has write activity attempted on it will be set to “Protected” in the DRU, preventing you from doing pretty much anything with that volume until the issue is addressed.  This behavior looked similar, so this is where I went.

If you’ve never used this utility, I don’t recommend playing around in it; you can cause some major problems for yourself if you’re not careful.

DRU

Data Retention Utility

So here we are in DRU, and everything appears to be fine.  (Note that the volumes from the error above are gone, because I didn’t think to screenshot this while I was having the issue.)

DRU Vol Status

Inside DRU

From here I went on a tangent looking in all manner of places, so to save you some time, I recommend just comparing against some other volumes that you’re pretty sure are configured correctly.  You’ll notice here that something IS different from the picture above.
DRU Correct Vol Status

Notice how S-VOL is enabled here?  Yeah, that was the problem.  Apparently, depending on how and where you set up and break your ShadowImage pairs, this setting may or may not set itself back to the enabled state.  My understanding is that “S-VOL disabled” indicates that the volume is already in a pair and therefore can’t be used for any other array operations until it is set back; this includes deleting the volume.

Junior National Championships for Olympic Weightlifting 2015

I had the privilege of volunteering at the Junior National Championships for Olympic Weightlifting in Oklahoma City this past weekend.

The entire event was run very well, thanks in no small part to Ryan Self and Jeremy Rutledge.  These two organized and gathered up the volunteers necessary for this event to be possible, and as far as I’m concerned it was the smoothest and best meet I’ve ever been a part of.

I was lucky enough to witness many American records get broken, some multiple times, throughout the meet.  I was humbled by the talent on display at this event, including many very talented female competitors who weigh quite a bit less than I do and can definitely outlift me.

I even managed to sneak my way into a hookgrip video featuring Maddy Myers during her 108kg clean and jerk while supervising the loaders at the side of the platform.

(I’m the bald guy; credit to hookgrip, obviously)

Notable lifters that I personally got to see were Megan Seegert, Maddy Myers, Mattie Rogers, Tyera Zweygardt, Jessie Bradley, Jaclyn Beed (hometown friend), Marissa Klingseis, Angelica Figueroa, Nathan Damron, CJ Cummings, Omar Cummings, and Michael Hunt.  Unfortunately I missed a lot of lifters due to either being unable to make it to the sessions, or because I was Marshaling.  You can find the final results here.

Here are a few pictures I managed to grab while I was there:


I was a bit too distracted watching the meet to actually take more pictures.

All in all the meet was great for motivating me to continue to improve.  I may never be a national level competitor, but that won’t stop me from trying.

HCS 8 Error: KAIC16036-E


I came across an issue in Hitachi Command Suite (HCS) at work that I wanted to talk about in the hopes of helping out other admins who search for this error.

I am a storage administrator, primarily responsible for Hitachi storage arrays.  While trying to balance the subscription rates in some of our DP pools using the “Migrate data wizard” (the migration tool in the tiers section), I encountered this.

The issue arises when you are asked to choose your migration targets and choose the “Create new DP volumes” option.

HCS: Migration Source

Everything is fine here

Then we go to choose our targets and get this: “Error: Insufficient LDEV IDs are available. Delete 1 unnecessary volumes, or register LDEV IDs to the resource group. (KAIC16036-E)”

HCS: Migration Target

The error in question: KAIC16036-E

If you’re like me and have potentially hundreds of volumes of varying sizes, then creating them all individually in the target pool is a daunting task, but it is also a workaround for this error.

The solution?

“The error KAIC16036-E displayed in [2. Migration Target] of the Migrate Data wizard: When you click the [Next>] button in the [2. Migration Target] screen, the error message KAIC16036-E might be displayed even though there are sufficient LDEV IDs. If there are sufficient LDEV IDs, wait a while without changing anything, and then click the [Next>] button again. If the same error message KAIC16036-E is displayed, cancel the Migrate Data wizard, wait a while, and then retry the operation.”

The above is supposedly from the HCS 8.1.2 release notes.

So basically, “wait a while”.  I’ve seen that waiting 3-5 minutes tends to work.  I’m assuming this is an issue with HCS 8 having some lag time when querying the array for a list of available LDEV IDs, but waiting does seem to work, in my case at least.

HCS: Migration Pair

Disclaimer: I never encountered this issue when using HCS 7.  We are currently running Hitachi Command Suite 8, and as far as I know it is still an issue in the newest release, HCS 8.1.2.

Another shot at blogging


I decided to give blogging another try in an attempt to catalog some new experiences and projects I have going.  I also happen to be paying for the domain, so I figure I might as well be using it.

Back when I was blogging semi-consistently, I was running things through Drupal.  I decided to do some experimenting with code and whatnot and “accidentally” nuked the site.  “Fear not!” I thought, “I have a backup!”  The backup was untested, and it turns out it didn’t work.

I became very discouraged because I lost a few good troubleshooting posts I had written, and I put the blogging thing on hold at that point.

I happened to be poking around and decided to take another look at the backup of my Drupal files, and I managed to track down the raw HTML of the posts I had made, buried deep inside some folders.

So here I am, back to try it again.  I will be posting some of the old posts I salvaged that I believe had some merit and will continue to post about some new work developments and projects I’ve started recently.