PowerShell to download O365 IP ranges

 
$ipv4CsvFile = "${env:USERPROFILE}\Documents\O365_IPv4_Addresses.csv"
$ipv6CsvFile = "${env:USERPROFILE}\Documents\O365_IPv6_Addresses.csv"
 
[xml]$xml = ( New-Object System.Net.WebClient ).DownloadString( "https://support.content.office.net/en-us/static/O365IPAddresses.xml" )
 
$products = $xml.products.product
 
$ipList = $products.addresslist.Where( { ( $_.type -in ("IPv4","IPv6") ) -and ( $_.address -ne $null ) } )
 
$ipv4 = ($ipList.Where({ $_.Type -eq 'IPv4'})).address
$ipv6 = ($ipList.Where({ $_.Type -eq 'IPv6'})).address
 
$ipv4 | Out-File -FilePath $ipv4CsvFile
$ipv6 | Out-File -FilePath $ipv6CsvFile

megaport-pstools Released

https://bitbucket.org/cbrochere/megaport-pstools/src

megaport-pstools

PowerShell Tools for automation and scripting of Megaport services.

This started life with the purpose of figuring out how one might schedule a bandwidth change on a VXC, but then blew up into various other tools to simplify other tasks and requests from users, such as exporting and graphing bandwidth usage, or detecting interface/connection issues.

While the Megaport web UX at https://megaport.al is really great, simple and intuitive, it’s a pain having to click buttons over and over – and besides, it ain’t “DevOps-y” cool. There’s always a need for scripted automation that integrates with other PowerShell suites such as the Azure PowerShell Tools.

Why PowerShell? Meh, why not? Actually, I’ve been spending a lot of time on Windows lately, running and writing Azure automation scripts, so it was pretty easy to write up a few test scenarios against the API with Invoke-RestMethod. By the time I’d finished testing 4-5 API endpoints I was already reusing the majority of the same code, so it pretty much escalated/optimized from there.
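Just to give a flavour of that pattern, here’s a minimal sketch of what those early Invoke-RestMethod tests looked like. The endpoint paths, field names and token header below are assumptions for illustration rather than anything lifted from megaport-pstools, so check the Megaport API docs (or the module source) for the real ones.

# NOTE: rough sketch only - endpoints, response shape and header name are assumed
$apiBase = "https://api.megaport.com"

# authenticate and grab a session token (body field names assumed)
$cred  = @{ username = "me@example.com"; password = "REDACTED" }
$login = Invoke-RestMethod -Method Post -Uri "$apiBase/v2/login" -Body $cred

# reuse the token on subsequent calls, e.g. to list provisioned services
$headers  = @{ "X-Auth-Token" = $login.data.token }
$services = Invoke-RestMethod -Method Get -Uri "$apiBase/v2/products" -Headers $headers
$services | Select-Object productName, productType, provisioningStatus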

Btw – PowerShell works on Windows, Mac (untested) and Linux, and VS Code is pretty and cool too 🙂

 

check_mk local check SSL certificate expiration

I was getting sick of tracking certificate expirations in Confluence and setting reminders in my calendar, so I thought: hey, why not make the monitoring system do this?

#!/usr/bin/perl
 
use strict;
use warnings;
use Net::SSL::ExpireDate;
use DateTime;
 
my $daysleft ;
my $endDate ;
my $dtnow = DateTime->now ;
my $status = { 'txt' => 'OK', 'val' => 0 };
 
my @hosts ;
 
push @hosts, 'www1.example.com';
 
foreach my $host (@hosts) {
        check_ssl_certificate($host);
}
 
sub check_ssl_certificate {
        my $host = shift;
        my $ed = Net::SSL::ExpireDate->new( https => "$host:443" ) ;
        if ( defined $ed->expire_date ) {
                $endDate = $ed->expire_date ;
                if ( $endDate >= $dtnow ) {
                        $daysleft = $dtnow->delta_days($endDate)->delta_days ;
                        # check the tighter threshold first, otherwise CRITICAL can never trigger
                        if ( $daysleft <= 45 ) {
                                 $status = { 'txt' => 'CRITICAL', 'val' => 2 } ;
                        } elsif ( $daysleft < 90 ) {
                                 $status = { 'txt' => 'WARNING', 'val' => 1 } ;
                        } else {
                                $status = { 'txt' => 'OK', 'val' => 0 } ;
                        }
                } else {
                        # certificate has already expired
                        $daysleft = 0 ;
                        $status = { 'txt' => 'CRITICAL', 'val' => 2 } ;
                }
                print "$status->{val} SSL_Certificate_$host Days=$daysleft; $status->{txt} - $host Expires on $endDate ($daysleft days)\n";
        }
}
 
exit($?);
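To wire it up as a check_mk local check, drop the script into the agent’s local check directory (typically /usr/lib/check_mk_agent/local on Linux) and make it executable. Each line it prints follows the local check format of status code, service name, performance data and status text, so a healthy host looks something like:

0 SSL_Certificate_www1.example.com Days=180; OK - www1.example.com Expires on 2019-06-01T00:00:00 (180 days)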

Sorting images into Exif Date Taken folder

So tonight I decided I’d had enough of dealing with backups and storage space constraints on my home PC, so I figured Google Drive @ USD $4.99 per month for 100GB is a bargain for a little less stress in my life. This of course left me wondering what to actually upload.

I picked my photos to start, but was immediately faced with a problem. Well two actually:

  1. A large number of images share similar names across various folders.
  2. Not all folders have photos sorted by date or by their actual Exif “Date taken” timestamp.
    Most of the photos are sorted into a yyyy-mm-dd folder for the day they were copied from the camera to the PC, and over the years some were just organised wherever.

Solution?

Script something to fix it!

The first thing to do was to rename the files uniquely, and then move them into their respective yyyy-mm-dd folders based on each image’s Exif date-taken timestamp.

 
#!/usr/bin/env perl
use 5.010;
use strict;
use warnings;
use autodie;
 
use Path::Class;
use File::Copy;
use Digest::MD5 'md5_hex';
use Image::ExifTool 'ImageInfo';
 
sub rename_by_exif_DateTaken {
# and make the filenames unique
     for my $f ( dir()->children ) {
          next if $f->is_dir;
          my $exif = Image::ExifTool->new;
          $exif->ExtractInfo($f->stringify);
          my $date = $exif->GetValue('DateTimeOriginal','PrintConv');
 
          next unless defined $date;
          $date =~ tr[ :][T-];
 
          my $digest = md5_hex($f->slurp);
          $digest = substr($digest,0,7);
          my $new_name = "$date-$digest.jpg";
 
          unless ( $f->basename eq $new_name ) {
               rename $f => $new_name;
          }
     }
}
 
rename_by_exif_DateTaken ;
 
sub sort_into_date_folders {
# yyyy-mm-dd
  for my $f ( dir()->children ) {
	next if $f->is_dir;
 
	my $exif = Image::ExifTool->new;
	   $exif->ExtractInfo($f->stringify);
 
	my $timestamp = $exif->GetValue('DateTimeOriginal', 'PrintConv');
	next unless defined $timestamp;
	# turn "YYYY:MM:DD HH:MM:SS" into "YYYY-MM-DDTHH-MM-SS" and keep the date part
	$timestamp =~ tr[ :][T-];
	my ($date, undef) = split /T/, $timestamp;
 
	print "$date\n";
 
	if ( ! -d $date ) {
		mkdir($date);
	}
 
	print "moving $f -> $date/$f\n";
	move("$f", "$date/$f");
 
  }
}
 
sort_into_date_folders ;

kudos to David Golden for the original inspiration.