How to Automate Tasks, File Transfers, and Data Security Using PowerShell
Introduction

Increased worker output is on the wish list of every organization out there. However, with budgets tightening and work demands increasing, that can prove challenging, and IT is already stretched thin as it is. How do organizations keep up while minimizing mistakes, meeting users' needs, and staying within budget? The answer is automation and PowerShell.
Many file transfers are predictable: a department might need a report at a particular time each day, a business partner might need the latest Excel spreadsheet detailing new product specifications, or a database might need to be backed up to a file in the cloud.
Every file transfer has a trigger. That trigger can be ad hoc, meaning the file is moved when an IT worker performs some action, or it can be automatic. In this article, we'll cover how to use PowerShell to create scheduled tasks that automate a file transfer script.
Copying files from one place to another is a trivial task no matter how you do it. And there are a number of ways to get the job done: dragging and dropping the file in Windows Explorer, Copy-Item with PowerShell, or the simple copy command in DOS. It's just a matter of specifying a source and a destination path and setting a few other optional parameters. It's only when you start copying a lot of files on a frequent basis that you run into problems. You shouldn't have to babysit all of the file copies; scheduled tasks are perfect for automating this job.

When automating file copies, especially in a Windows environment, your go-to scripting language is going to be Windows PowerShell. If you need to quickly copy one or more files from one destination to another, PowerShell is a great way to do that. Also, not only is it easy to manually kick off PowerShell scripts, but you can also trigger transfers via PowerShell scripts by using Windows scheduled tasks.

In this article, we'll go over how to perform file transfers using PowerShell by writing a script and creating a scheduled task to kick off that script on a recurring basis. But before we start, I'm going to assume that you have at least PowerShell v4 installed on your computer. Otherwise, the tricks I'm about to show you may not work properly.
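As a concrete starting point, here is a minimal sketch of the kind of file transfer script a scheduled task could run. The paths and script name are placeholders for illustration, not values from the article:

```powershell
# Copy-Files.ps1 -- minimal file transfer script (placeholder paths)
# Copies everything under a source folder to a destination share.
param (
    [string]$Source      = 'C:\Source',       # hypothetical source folder
    [string]$Destination = '\\SERVER1\Share'  # hypothetical destination share
)

# -Recurse picks up subfolders; -Force overwrites files that already exist.
Copy-Item -Path (Join-Path $Source '*') -Destination $Destination -Recurse -Force
```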
You could create scheduled tasks by running the Task Scheduler GUI and creating one that way, but we're all about automation here, so let's build the task with PowerShell instead. Registering the script this way means it will copy all files from your source to the destination every day at 3 a.m.
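Using the ScheduledTasks module that ships with Windows 8.1/Server 2012 R2 and later, the registration might look like the sketch below. The script path and task name are assumptions for illustration:

```powershell
# Run the copy script every day at 3 a.m. under the SYSTEM account.
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' `
    -Argument '-NoProfile -File C:\Scripts\Copy-Files.ps1'  # hypothetical script path
$trigger = New-ScheduledTaskTrigger -Daily -At 3am

Register-ScheduledTask -TaskName 'Daily File Copy' -Action $action `
    -Trigger $trigger -User 'SYSTEM'
```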
In today's dangerous cyber environment, it's more important than ever to protect your data. Bad guys are always on the lookout for an easy score. As a sysadmin, it's one of your many jobs to set up security controls and make sure your network is not an easy target.

One way to do that is to ensure your network perimeter is secured to prevent any unauthorized access. However, what if your network is breached anyway? Perhaps someone physically comes into your data center and steals a server to gather valuable data you may have stored on it. If your data is not encrypted, kiss it goodbye. But if you had the foresight to encrypt the data on that server beforehand, while your data might still be gone, at least you'll know it won't be read.

Encrypting data is always a good idea, but it can be hard to manage, especially across different servers and storage locations. By using Microsoft's built-in Encrypting File System (EFS) technology and PowerShell, the task of encrypting and decrypting one, two, or millions of files across your data center can be a lot easier.

In this article, I'll show you how you can manually encrypt and decrypt files with EFS using the GUI. Finally, I'll go over some PowerShell code that will allow you to perform this task over many different locations at once.
[Screenshot: the file's Advanced Attributes dialog ("File is ready for archiving", "Allow this file to have contents indexed in addition to file properties"), where EFS encryption is enabled from the GUI.]

To encrypt all files in a folder from PowerShell, call the EFS Encrypt() method on each file:

(Get-ChildItem –Path C:\Documents).Encrypt()

To decrypt:

(Get-ChildItem –Path C:\Documents).Decrypt()
Get-ChildItem C:\Documents | Enable-FileEncryption
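Note that Enable-FileEncryption is not a built-in cmdlet; the article treats it as a helper. A minimal sketch of such a function, assuming it simply wraps the EFS Encrypt() method for each file passed down the pipeline, might look like this:

```powershell
# Hypothetical helper: EFS-encrypt every file piped in.
function Enable-FileEncryption {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory, ValueFromPipeline)]
        [System.IO.FileInfo]$File
    )
    process {
        # FileInfo.Encrypt() uses EFS under the covers; requires NTFS.
        $File.Encrypt()
    }
}

Get-ChildItem C:\Documents -File | Enable-FileEncryption
```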
It's easy to copy files with PowerShell Copy-Item via the command line. Once you specify the source and destination location, it just happens. Unfortunately, many administrators don't think about how this process occurs until it doesn't work. Whether or not you think about this, all TCP network communication (such as SMB file copies) uses network ports to make the bits transfer. For a file copy process to get a file from point A to point B, a port needs to be open all the way to the destination node. In the case of an SMB file copy, that port is 445. This is a common port that's usually open internally, except in some high-security situations or across a DMZ.

PowerShell Copy-Item

If you're in a high-security environment or need to transfer files from an internal network to a DMZ that might have various port restrictions, how can you ensure that your scripts are able to copy files to nodes all the time? One way to do that is to copy files over a PowerShell remoting session instead of SMB.
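Before troubleshooting a failed copy, it can help to confirm that the port really is open. On Windows 8.1/Server 2012 R2 and later, Test-NetConnection can check SMB's port 445 (SERVER1 is a placeholder computer name):

```powershell
# Returns $true if a TCP connection to port 445 on SERVER1 succeeds.
Test-NetConnection -ComputerName SERVER1 -Port 445 -InformationLevel Quiet
```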
When you're using PowerShell Copy-Item via the traditional SMB method, you need to specify the Path and Destination parameters. If you'd like to copy a file called file1.txt inside of C:\Folder1 to a remote computer SERVER1 on its C:\, you could do this:

Copy-Item –Path C:\Folder1\file1.txt –Destination \\SERVER1\c$

But what if SMB is blocked for some reason, or you're using Invoke-Command to run commands on SERVER1 anyway? You can leverage PowerShell remoting sessions to transfer the file over WinRM instead of SMB. In order to do this, you must establish a new remoting session and then pass the file over that session.

First, create a new PowerShell remoting session with the New-PSSession cmdlet and assign the session to the $session variable. Then pass that session to Copy-Item:

$session = New-PSSession –ComputerName SERVER1
Copy-Item –Path C:\Folder1\file1.txt –Destination 'C:\' –ToSession $session

Notice that you're now using C:\ for the destination rather than a UNC path. This command will accomplish the exact same thing as your previous one, but it will use the session to encapsulate the file and transfer it via WinRM.

That's all there is to it! The next time you find yourself in an environment where PowerShell remoting is allowed but SMB is restricted, or you're already using a remoting session for something else, you can pass the session to Copy-Item to get your file easily from point A to point B. Don't forget to remove the session when you're done:

$session | Remove-PSSession
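For the case mentioned above where you're already running Invoke-Command against SERVER1, a sketch of reusing a single session for both the copy and a follow-up command (paths and names as in the example; note that Copy-Item's -ToSession parameter requires PowerShell v5 or later) might look like this:

```powershell
# Reuse one session for both the file copy and remote commands.
$session = New-PSSession -ComputerName SERVER1

# Copy the file over WinRM rather than SMB.
Copy-Item -Path C:\Folder1\file1.txt -Destination 'C:\' -ToSession $session

# Run a command on SERVER1 over the same session, e.g. verify the copy landed.
Invoke-Command -Session $session -ScriptBlock { Test-Path 'C:\file1.txt' }

# Clean up when finished.
$session | Remove-PSSession
```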
When working with Microsoft Azure, you'll inevitably come to a point where you need to access material stored locally on premises. This might be a virtual disk in VHD format to use for Azure's IaaS service, a few PowerShell scripts you need executed on your Azure virtual machines, or maybe just some configuration files for your Azure websites. Regardless, for your Azure resources to access these files, they'll need to be located in an Azure storage account. I'll be doing this via the Set-AzureStorageBlobContent PowerShell cmdlet using the newer Azure Resource Manager (ARM) resources.

The Set-AzureStorageBlobContent cmdlet is available in the Azure PowerShell module, so you'll need to ensure you get this module downloaded and available for use first. You'll also need an Azure subscription as well as a storage account to store your files. In this example, I'll assume you already have a storage container pre-created.
To get started you'll first need to authenticate your Azure subscription, which you can do using the Add-AzureRmAccount cmdlet. This will prompt you for a username and password, granting you the token necessary to make changes to your Azure subscription.

Once you've authenticated your Azure subscription, you'll need to specify a storage account in which to create your Azure storage blob. Your local files will automatically be stored as blobs once they're transferred to Azure. To specify a storage account, you can use the Get-AzureRmStorageAccount cmdlet. Below, I have two storage accounts available to me:

Get-AzureRmStorageAccount |
select storageaccountname

You can see I've assigned this storage container to a variable, allowing me to quickly pass the object to the Set-AzureStorageBlobContent cmdlet. Once I have the storage container, I then need to define the local file path and the destination path. To do this, I'll use these all as parameters to the Set-AzureStorageBlobContent cmdlet:

$FilePath = 'C:\Users\Adam\MyFile.txt'
$BlobName = 'MyFile.txt'

$storageContainer |
Set-AzureStorageBlobContent –File $FilePath –Blob $BlobName
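The step that assigns the storage container to the $storageContainer variable isn't shown above; a sketch of the full flow, assuming a resource group 'MyRG', a storage account 'mystorageaccount', and a container 'mycontainer' (all placeholder names), might look like this:

```powershell
# Authenticate and select the storage account (placeholder names throughout).
Add-AzureRmAccount
$storageAccount = Get-AzureRmStorageAccount -ResourceGroupName 'MyRG' `
    -Name 'mystorageaccount'

# Grab the pre-created container from the account's storage context.
$storageContainer = Get-AzureStorageContainer -Name 'mycontainer' `
    -Context $storageAccount.Context

# Upload the local file as a blob.
$FilePath = 'C:\Users\Adam\MyFile.txt'
$BlobName = 'MyFile.txt'
$storageContainer | Set-AzureStorageBlobContent -File $FilePath -Blob $BlobName
```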
Adam Bertram
Adam Bertram is an independent consultant,
technical writer, trainer, and presenter. Adam
specializes in consulting and evangelizing all things IT automation, mainly focused on Windows PowerShell and Microsoft System Center. Adam is a Microsoft Windows Cloud
and Datacenter Management MVP focused on
Windows PowerShell and has numerous Microsoft
IT pro certifications. He authors IT pro course
content for Pluralsight, is a regular contributor
to numerous print and online publications and
presents at various user groups and conferences.
You can find Adam at adamtheautomator.com
or on Twitter at @adbertram.
MOVEit Transfer
Thousands of IT teams depend
on MOVEit Transfer to secure
files at rest and in transit and
assure compliance.