Install dbatools module without admin rights

If you have ever faced the issue of not having rights to install a PowerShell module, like the error shown in the image below for dbatools, one workaround is to install it for your user only.

In this case you can use the Scope parameter to install it for your user only, as the message itself suggests.

Install-Module dbatools -Scope CurrentUser
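
Once the install finishes, a quick check like the one below (not part of the original tip, just a sanity check) confirms the module landed under your user profile rather than Program Files:

Get-InstalledModule dbatools | Select-Object Name, Version, InstalledLocation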



How to add the Active Directory PowerShell module

I’m going to show the steps to add the AD module on your machine and on a server, which you need if you want to run commands like Get-ADPrincipalGroupMembership to find out what groups a specific user belongs to.

Adding this module to your local machine is quite easy: install the Remote Server Administration Tools (RSAT) for Windows 10 package.
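
If you are on Windows 10 1809 or later, RSAT ships as a set of optional features instead of a separate download, so you can add just the AD tools from an elevated prompt. This is a sketch; double-check the capability name on your build:

# Adds only the Active Directory RSAT tools (Windows 10 1809+; capability name may vary by build)
Add-WindowsCapability -Online -Name 'Rsat.ActiveDirectory.DS-LDS.Tools~~~~0.0.1.0'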

But to install it on a Windows Server, for example, you need to add the feature by following the steps below (or use the PowerShell alternative shown after the list).

  1. Open Server Manager Dashboard
  2. Click Manage -> Add Roles and Features Wizard
  3. Click Next until you reach the Features step shown in the picture below and select the feature there
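
If you prefer to skip the wizard, the same feature can be added from an elevated PowerShell prompt on the server; RSAT-AD-PowerShell is the feature that carries the Active Directory module:

# Adds the Active Directory module for Windows PowerShell feature on Windows Server
Install-WindowsFeature RSAT-AD-PowerShell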

After installing it (no restart needed) you can run Import-Module ActiveDirectory and then something like this:

# List every group the user belongs to
Get-ADPrincipalGroupMembership -Identity 'user_name' | Select-Object Name

# Find groups matching a name pattern, then list the nested groups inside the one you want
$groups = Get-ADGroup -Filter {name -like 'user_group'} -Properties * | Select-Object -Property Name

foreach($group in $groups) {
    if($group.Name -ceq 'Group trying to find') {
        $group.Name
        Get-ADGroupMember -Identity $group.Name | Where-Object objectClass -Like 'group' | Select-Object Name
    }
}

Sending files to AWS S3 using PowerShell

Amazon has a PowerShell module to manage its main services. I’ve been working with EC2, RDS and S3, I wrote a tip on quickly copying data to S3, and I created the function below to help send files to S3.

I’m using the function below to send my backups to S3. It’s configured to send all files in the paths passed by parameter.
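
If you don’t have the AWS module yet, it is available from the PowerShell Gallery, and the same -Scope CurrentUser trick from the dbatools tip applies if you lack admin rights:

Install-Module AWSPowerShell -Scope CurrentUser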

Import-Module AWSPowerShell

# Author: Douglas Correa

function Send-S3Files {
 <#
 .SYNOPSIS
 Send the files from a local path to AWS S3
 .DESCRIPTION
 The function will copy all files from a specific path to AWS S3
 .EXAMPLE
 Send-S3Files -BucketName 'backups' -Region 'sa-east-1' -AKey '####' -SKey '####' -LocalSource 'c:\temp' , 'd:\backups' 
 #>
 [CmdletBinding()]
 Param (
       [Parameter(Mandatory=$True, ValueFromPipeline=$False, HelpMessage='Name of the bucket in AWS')][string]$BucketName
     , [Parameter(Mandatory=$True, ValueFromPipeline=$False, HelpMessage='Region used in AWS')][string]$Region
     , [Parameter(Mandatory=$True, HelpMessage='AWS access key')][string]$Akey
     , [Parameter(Mandatory=$True, HelpMessage='AWS secret key')][string]$SKey
     , [Parameter(Mandatory=$True, ValueFromPipeline=$False, HelpMessage='Local machine paths')][string[]]$LocalSource
 )

    process {

        # Set the AWS credentials and default region for this session
        Initialize-AWSDefaultConfiguration -AccessKey $AKey -SecretKey $SKey -Region $Region

        foreach($source in $LocalSource) {
            Set-Location $source
            $files = Get-ChildItem '*.*' | Select-Object -Property Name # get all files in the folder

            try {
                if(Test-S3Bucket -BucketName $BucketName) {
                    foreach($file in $files) {
                        # Skip files that already exist in the bucket
                        if(!(Get-S3Object -BucketName $BucketName -Key $file.Name)) {
                            Write-Host "Copying file : $($file.Name)"
                            Write-S3Object -BucketName $BucketName -File $file.Name -Key $file.Name -CannedACLName private -Region $Region
                        }
                    }
                } else {
                    Write-Host "The bucket $BucketName does not exist."
                }
            } catch {
                Write-Host "Error uploading file $($file.Name)"
                $_
            }
        }
    }
}