Dave's Technophorical Times

A blog about Microsoft's Technologies!
SharePoint :: MVC :: ASP.NET :: IIS :: SQL Server :: Visual Studio :: MS Access

Usually you will see this if the user does not have rights to the SharePoint config database, or if the .NET 4.X framework is not installed.

All the Best
Dave



Here is a script that I wrote for backing up some very important files to an off-site FTP location. It backs up only files that are new or have changed in the last 24 hours. That number of days could be a parameter based on when the backup last ran (see the sketch after the script), but this is base code that you can start using right away. Just add it to a Task Scheduler task that runs once a day.


# ==============================================================================================
# Set the Date/Time
# ==============================================================================================
$BackUpdateTime = Get-Date -Format yyyyMMddHHmmss   # zero-padded timestamp for file naming
$today = (Get-Date -Format yyyy-MM-dd)

try {
 $ftp = "ftp://ftp.mysite.ca/"
 $user = "ftpuser"
 $pass = "ftppassword" 
 
 $webclient = New-Object System.Net.WebClient
 $webclient.Credentials = New-Object System.Net.NetworkCredential($user,$pass) 

 # Specify the directory containing all the files that we want to upload
 $Dir = "Y:\LocalDirectory\"
 $LogFile="I:\PowerShell\MyBackup_"+$today+".txt"
 Clear-Host
 #Write-Host $LogFile
 "From:"+$Dir+" (on server01) To:"+$ftp | Out-File $LogFile -Append
 "Start: "+(Get-Date) | Out-File $LogFile -Append

 $files = @(Get-ChildItem -Path $Dir -Recurse |
     Where-Object { !$_.PSIsContainer -and $_.LastWriteTime -gt (Get-Date).AddDays(-1) } |
     Select-Object -ExpandProperty FullName)
 foreach($item in $files)
 {
     if ($item -ne $null)
     {
         # Substring(3) strips the local drive prefix ("Y:\") so the remote path mirrors the local tree
         $uri = New-Object System.Uri($ftp + $item.Substring(3))
         $webclient.UploadFile($uri, $item)
         #Write-Host (Get-Date)$item
         "$(Get-Date): " + $item | Out-File $LogFile -Append
     }
 }
 $webclient.Dispose()

 "End:"+(Get-Date) | Out-File $LogFile -Append
 
 $msg = new-object Net.Mail.MailMessage
 
 # Edit the From Address as per your environment. 
 $msg.From = "Backup (server01) <my.email@mysite.ca>"

 # Edit the mail address to which the Notification should be sent. 
 $msg.To.Add("my.email@mysite.ca")

 # Subject for the notification email. The + "$today" part adds the date to the subject.
 $msg.Subject = "Backup was Successful for " + "$today"

 # Body of the notification email. The + $today part adds the date to the body.
 $msg.Body = "Backup was Successful for " + $today + "`r`n`r`n" 
 
 $att = new-object Net.Mail.Attachment($LogFile)
 $msg.Attachments.Add($att)
 
 # IP address of your SMTP server.
 $smtpServer = "smtp.mysite.ca" 
 $smtp = new-object Net.Mail.SmtpClient($smtpServer) 

 $smtp.Send($msg)
 $msg.Dispose()
}
Catch { 
 $ErrorMessage = $_.Exception.Message 

 # Configure the below parameters as per the above. 
 $msg = new-object Net.Mail.MailMessage
 $msg.From = "Backup (server01) <my.email@mysite.ca>" 
 $msg.To.Add("my.email@mysite.ca") 
 $msg.Subject = "Backup Job failed on " + "$today" 
 $msg.Body = "Job failed on " + "$today and the reason for failure was $ErrorMessage." 
 
 $att = new-object Net.Mail.Attachment($LogFile)
 $msg.Attachments.Add($att)
 
 $smtpServer = "smtp.mysite.ca" 
 $smtp = new-object Net.Mail.SmtpClient($smtpServer) 
 
 $smtp.Send($msg)
 $msg.Dispose()
}
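As mentioned above, the one-day window could be derived from the last successful run instead of being hard-coded. A minimal sketch, assuming you persist the last run time to a text file (the path below is a placeholder):

$stampFile = "I:\PowerShell\LastBackupRun.txt"
# Fall back to a 24-hour window when no previous run has been recorded
$since = if (Test-Path $stampFile) { Get-Date (Get-Content $stampFile) } else { (Get-Date).AddDays(-1) }
# ...then filter with: Where-Object { $_.LastWriteTime -gt $since }
# After a successful run, record the time for the next run:
Get-Date -Format o | Set-Content $stampFile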



The files that you attach to an email remain locked until the instance of PowerShell you are running has exited completely. So if you run a script through PowerShell ISE that attaches a file to an email, that file will remain locked until you exit PowerShell ISE.

If the file is locked by PowerShell and you try to modify it in any way, you will get an error/warning message similar to the following:

The process cannot access the file 'c:\filename.txt' because it is being used by another process

By using the following command, you can ensure that PowerShell disposes of the email message once it has been sent and does not continue to lock any files you attached and sent via email:

$MailMessage.Dispose()

Note: this assumes that $MailMessage = New-Object System.Net.Mail.MailMessage
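If you want the file released even when sending fails, you can wrap the send in try/finally. A minimal sketch, reusing the placeholder addresses and server from the backup script above:

$MailMessage = New-Object System.Net.Mail.MailMessage
try {
    $MailMessage.From = "Backup (server01) <my.email@mysite.ca>"
    $MailMessage.To.Add("my.email@mysite.ca")
    $MailMessage.Subject = "Backup log"
    $MailMessage.Attachments.Add((New-Object System.Net.Mail.Attachment("c:\filename.txt")))
    (New-Object System.Net.Mail.SmtpClient("smtp.mysite.ca")).Send($MailMessage)
}
finally {
    # Disposing the message also disposes its attachments, releasing the file lock
    $MailMessage.Dispose()
}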



Here is some code for exporting your Outlook Notes.

Sub NotesToRTF()
    Dim myNote As Variant
    Dim cnt As Integer
    Dim noteName As String
    Dim strExportFolder As String
   
    strExportFolder = "C:\Notes-RTF\"
   
    If Dir(strExportFolder, vbDirectory) = "" Then
        MkDir strExportFolder
    End If
   
    ' Prompt the user to pick the folder to export (e.g. the Notes folder)
    Set myNote = Application.GetNamespace("MAPI").PickFolder
    For cnt = 1 To myNote.Items.Count
        ' Strip characters that are invalid in file names from the subject
        noteName = Trim(Replace(Replace(Replace(Replace(myNote.Items(cnt).Subject, "/", "-"), "\", "-"), ":", ""), vbTab, ""))
        Debug.Print noteName
        myNote.Items(cnt).SaveAs strExportFolder & noteName & ".rtf", OlSaveAsType.olRTF
    Next
   
    ' Open the export folder in Explorer when done
    Shell "C:\WINDOWS\explorer.exe """ & strExportFolder & """", vbNormalFocus
End Sub

Sub NotesToText()
    Dim myNote As Variant
    Dim cnt As Integer
    Dim noteName As String
    Dim strExportFolder As String
   
    strExportFolder = "C:\Notes-Text\"
   
    If Dir(strExportFolder, vbDirectory) = "" Then
        MkDir strExportFolder
    End If
   
    ' Prompt the user to pick the folder to export (e.g. the Notes folder)
    Set myNote = Application.GetNamespace("MAPI").PickFolder
    For cnt = 1 To myNote.Items.Count
        ' Strip characters that are invalid in file names from the subject
        noteName = Trim(Replace(Replace(Replace(Replace(myNote.Items(cnt).Subject, "/", "-"), "\", "-"), ":", ""), vbTab, ""))
        Debug.Print noteName
        myNote.Items(cnt).SaveAs strExportFolder & noteName & ".txt", OlSaveAsType.olTXT
    Next
   
    ' Open the export folder in Explorer when done
    Shell "C:\WINDOWS\explorer.exe """ & strExportFolder & """", vbNormalFocus
End Sub

Hope it helps someone else out there.



When you install a new farm, you might be curious to find out what the Health Analyzer results are. Instead of waiting for the checks to happen (basically timer jobs scheduled at different intervals), you could force them to run all at one go, so you can review the results and address them.

You can easily run this in PowerShell. All it does is start every timer job with 'Health' in its name, which fires off all the SharePoint 2010 Health Analyzer jobs for your new farm in one go.

# Check to ensure Microsoft.SharePoint.PowerShell is loaded
$snapin = Get-PSSnapin | Where-Object {$_.Name -eq 'Microsoft.SharePoint.Powershell'}
if ($snapin -eq $null) {
  Write-Host "Loading SharePoint Powershell Snapin"
  Add-PSSnapin "Microsoft.SharePoint.Powershell"
}
Get-SPTimerJob | Where {$_.Name -like "*Health*" -and $_.Name -like "*-all-*"} | Start-SPTimerJob
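To confirm the jobs actually ran, a quick sketch that checks the last run time of the same set of jobs afterwards:

Get-SPTimerJob | Where {$_.Name -like "*Health*" -and $_.Name -like "*-all-*"} | Select Name, LastRunTime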


QUESTION: I'm using the SharePoint 2010 Health Analyzer and I've noticed that on some of the issues, when I open the Review window, either or both of the Actions or Reanalyze Now options are grayed out. The pop-up window says I may not have the right permission level to use this, might need to select an object or item, or the control might not work in this context. Can anyone explain why I wouldn't be able to do a Reanalyze Now? It seems strange this would be grayed out at all.

ANSWER:

Not only was Reanalyze Now greyed out, but none of the scheduled tasks were running. I traced it down to the fact that the account used to run the "SharePoint 2010 Timer" service was not a local admin. I changed it to a domain service account that has local admin rights and the rights outlined below:

1. Verify that the user account that is performing this procedure is a member of the Administrators group on the local computer.

2. Click Start, click Administrative Tools, and then click Services.

3. Right-click Windows SharePoint Services Timer V4, and then click Properties.

4. On the Log On tab, confirm that the account being used is a domain user account and is a member of the following:

   • dbcreator fixed SQL Server server role
   • securityadmin fixed SQL Server server role
   • db_owner fixed database role for all databases in the server farm
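Before making the change, a quick sketch to see which account the timer service currently runs as (SPTimerV4 is the service name behind "SharePoint 2010 Timer"):

Get-WmiObject Win32_Service -Filter "Name='SPTimerV4'" | Select-Object StartName, State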

Here is the Microsoft link I used:

http://technet.microsoft.com/en-us/library/ee924649.aspx

This resolved my issue. I hope this helps.



In Central Administration, if you visit Upgrade and Patch Management, Review Database Status, and notice that some DBs have the status 'Database is up to date, but some sites are not completely upgraded', follow the procedure below to rectify this.

If you are attempting to upgrade the Admin Content database, Get-SPContentDatabase will not work (or at least it didn't for me), so refer to the extra steps at the end of this article.

The upgrade status reflects the current state of the database, and hence this is not resolved via PSConfig but rather by PowerShell. If you need to perform this on production or any other important environment, make sure you have an agreed maintenance window plus backups before you begin!

  • Using the farm account or an equivalently privileged user, start the SharePoint 2010 Management Shell
  • Enter $DB = Get-SPContentDatabase -Identity [Your Database Name]
  • Now enter Upgrade-SPContentDatabase -id $DB
  • You'll now be prompted to confirm; just hit Enter to accept the default of Yes
  • The upgrade will begin and a percentage complete will be displayed (the commands are shown together below)
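Put together, the upgrade looks like this (the database name below is a placeholder):

$DB = Get-SPContentDatabase -Identity "WSS_Content_YourDB"
Upgrade-SPContentDatabase -id $DB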

When the process completes, return to Manage Databases Upgrade Status in Central Admin and the status will now be 'No action required'.

Upgrading the Admin Content Database

As stated at the beginning of this article, Get-SPContentDatabase will not work with the Admin Content database, so another method is needed. As long as you know the DB GUID, you can still execute Upgrade-SPContentDatabase, so a simple query of the Config database could be used to find it. However, a better solution is to use PowerShell.

The following script iterates through all Content DBs for a given Web Application, so it can be used to upgrade the Central Admin Content Database. You will need to replace <WA URL> in the script with your Central Administration URL.

$wa = Get-SPWebApplication -Identity "<WA URL>"
foreach($ContentDB in $wa.ContentDatabases)
{
  Upgrade-SPContentDatabase -id $ContentDB
}



Rename a chart

When you create charts, Microsoft Office Excel assigns a default name to each chart by using the following naming convention: Chart1, Chart2, and so on. However, you can change the name of each chart to make it more meaningful to you.

  1. Click the chart that you want to rename.

     This displays the Chart Tools, adding the Design, Layout, and Format tabs.

  2. On the Layout tab, in the Properties group, click the Chart Name text box.

     Tip: You may have to click the Properties icon in the Properties group to expand the group.

  3. Type a new name.
  4. Press ENTER.
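If you prefer to script it, here is a minimal PowerShell COM automation sketch; the workbook path and chart name are examples, not part of the original tip:

# Rename the first chart on the first worksheet via Excel COM automation
$excel = New-Object -ComObject Excel.Application
$wb = $excel.Workbooks.Open("C:\Reports\Book1.xlsx")   # example path
$wb.Worksheets.Item(1).ChartObjects(1).Name = "SalesChart"   # example name
$wb.Save()
$excel.Quit()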


The ItemUpdated event does fire twice - by design. Once when the item is actually updated, and then again when SharePoint is done checking in the item.
See http://www.simple-talk.com/dotnet/.net-tools/managing-itemupdating-and-itemupdated-events-firing-twice-in-a-sharepoint-item-event-receiver/ for more info on that. 
Some people use a workaround like this:
if (properties.AfterProperties["vti_sourcecontrolcheckedoutby"] == null &&
    properties.BeforeProperties["vti_sourcecontrolcheckedoutby"] != null)
{
    // The update event was triggered by a check-in.
}
else
{
    // Triggered by events other than the check-in action.
}


The default file name given to your solution will match the project name, e.g. a SharePoint project SharePointProject1 will result in a package SharePointProject1.wsp.

If you want VS to spit out a different name, modify the name attribute in the Package\Package.package file for your project.

To open the file mentioned in the above post, use Notepad, as Visual Studio will automatically detect the .package extension and open the associated template file and NOT the .package file itself.

Also be sure to remove the write protection from the file; otherwise Notepad cannot overwrite it.
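A quick way to clear the read-only flag from a PowerShell prompt (a sketch; the relative path assumes you are in the project folder):

Set-ItemProperty -Path ".\Package\Package.package" -Name IsReadOnly -Value $false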



The Blogger

Dave Stuart I'm a Developer with a passion for coding. I enjoy the challenges that come with the job! SharePoint is one of my expert areas, along with .NET Web Development with MVC and good old MS Access VBA coding. I Blog so that I can remember how I did that way back when; PLUS all this stuff is searchable! I constantly study and run my own business, Dafran Inc. I have passed 22 Microsoft Exams since 1998, when I first jumped on the treadmill of knowledge. I hope that you enjoy this Blog as much as I enjoy updating it. All the very best from Calgary, Alberta, Canada. Contact me at linkedin @ dafran.ca
