VS 2010 Unable to automatically step into the server

If you need to debug a WCF service, or otherwise need to step into code running on IIS, you will need to do the following to get rid of the message “Unable to automatically step into the server. Unable to determine a stopping location”. The exact wording might differ slightly, but generally it means you can't debug into IIS!

It turns out that Visual Studio 2010 comes with a default setting to debug ‘Just my code’.

To correct the problem, go to Tools – Options – Debugging – General and switch off the option ‘Enable Just My Code’.

And of course don’t forget to set debug="true" in the web.config file of the web service. If you are using publish transformations, double-check them too, as they might be removing the debug="true" attribute from your web.config file.
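
For reference, here is a minimal sketch of the two places to check, based on the standard ASP.NET web.config layout (your real files will contain more settings). In web.config the flag sits on the compilation element:

<system.web>
  <compilation debug="true" />
</system.web>

And the default Web.Release.config transform strips it on publish, so remove this line (or publish with a different configuration) if you need to debug the deployed service:

<compilation xdt:Transform="RemoveAttributes(debug)" />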

This will hopefully help someone else out there!

Calculate Current Year for Choice Column

If you need to display the current year as the default value for a Choice column in a SharePoint list or library, you can do so by entering =TEXT(Today,"YYYY") in the Default Value box after selecting "Calculated Value".

 

SharePoint is not searching the content.

If your search engine crawl is not returning any "content" results from a full crawl, i.e. PDF and DOCX files are not being returned, then you probably need to register the mssph.dll file. Note that searching PDF content will also require the Adobe PDF iFilter.

Start-->Run

regsvr32 "C:\Program Files\Common Files\Microsoft Shared\web server extensions\14\BIN\mssph.dll"

Try a full crawl again and it should work.
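
If you prefer to kick the full crawl off from PowerShell instead of Central Admin, a sketch along these lines should do it from the SharePoint 2010 Management Shell (it assumes a single Search service application and starts a full crawl of every content source):

$ssa = Get-SPEnterpriseSearchServiceApplication
Get-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa | ForEach-Object { $_.StartFullCrawl() }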

 

Oracle SQL Developer: Show Time with Date

By default, SQL Developer does not display the time portion of DATE columns when running a SELECT statement.

You can change this in the Tools / Preferences dialog of SQL Developer:

  1. Select Database / NLS Parameters in the tree view on the left.

  2. Then put dd/mm/yyyy hh24:mi:ss into the Date Format field.

  3. Press OK to close the dialog.
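
If you only want this for the current session or a single query, the same format mask can also be applied in plain SQL rather than through the preference; this is standard Oracle syntax, and my_table / my_date_column are placeholder names:

ALTER SESSION SET NLS_DATE_FORMAT = 'DD/MM/YYYY HH24:MI:SS';

-- or, for a single query:
SELECT TO_CHAR(my_date_column, 'DD/MM/YYYY HH24:MI:SS') AS my_date FROM my_table;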

Enjoy!

PowerGUI does not work when running SharePoint Scripts

PowerGUI does not work when running SharePoint scripts because, by default, its four config files list the supported .NET runtimes in the reverse order, so the SharePoint 2010 snap-in (which needs the .NET 2.0 runtime rather than 4.0) cannot load.


Switch them around so that the v2.0 runtime entry comes first, as in the sketch below, and your SharePoint scripts will now work.
You will need to open the config files in Notepad running as Administrator!
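
As a rough sketch of what to look for (the exact file names, attributes and version numbers in your PowerGUI installation may differ), the startup section in each config file lists the supported runtimes. By default the 4.0 entry comes first:

<startup>
  <supportedRuntime version="v4.0.30319" />
  <supportedRuntime version="v2.0.50727" />
</startup>

Switched around, so the 2.0 runtime (which the SharePoint 2010 snap-in needs) is loaded:

<startup>
  <supportedRuntime version="v2.0.50727" />
  <supportedRuntime version="v4.0.30319" />
</startup>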

 

 

Hope this saves the day for someone.

Accounts used by application pools or service identities are in the local machine Administrators group.

The message can be ignored, or can it? http://technet.microsoft.com/en-gb/library/hh344223.aspx. It only states that if you use an account in the local Administrators group, that account has the rights to execute malicious code without even being prompted.

Add the farm account to the local Administrators group. This is stated in the TechNet article: http://technet.microsoft.com/en-us/library/ee721049.aspx

"The Server Farm account, which is created during the SharePoint farm setup, must also be a member of the Administrators group on the server where the User Profile Synchronization service is deployed."

There seem to be some conflicting opinions about the correct permissions, as doing this will cause the SharePoint Health Analyzer to create a warning:

"Accounts used by application pools or service identities are in the local machine Administrators group. Using highly-privileged accounts as application pool or as service identities poses a security risk to the farm, and could allow malicious code to execute."

Also grant the farm account the Replicate Directory Changes permission [http://support.microsoft.com/kb/303972], then reboot the server to make sure that all the services using the farm account run with the new privileges.

One point to make for the site app pools:
As a SharePoint best practice, refrain from using the built-in machine administrator account for any SharePoint application pools or service identities.

From the article on TechNet: http://technet.microsoft.com/en-us/library/cc678863(office.12).aspx (though the article is for MOSS 2007, it is still relevant):

"The other application pool account must be a domain user account. This account must not be a member of the administrators group on any computer in the server farm."
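
If you want to double-check which identities your farm is actually using, a quick sketch like this, run from the SharePoint 2010 Management Shell, lists the service application pool accounts and the web application pool accounts so you can confirm none of them is a local administrator:

# Service application pool identities
Get-SPServiceApplicationPool | Select-Object Name, ProcessAccountName

# Web application pool identities
Get-SPWebApplication | ForEach-Object { "$($_.DisplayName) : $($_.ApplicationPool.Username)" }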

Dealing with Blobs in Oracle, SQL and MS Access

I came across a challenge this week: migrating BLOB data from Oracle to SQL Server. I used MS Access for this, since I could create an ODBC DSN to each database and then have the best of all worlds: review the data, count the records and write very simple code. The link below is excellent and worked perfectly for what I needed when dealing with the BLOBs.

How To Read and Write BLOBs Using GetChunk and AppendChunk: http://support.microsoft.com/default.aspx?scid=kb;en-us;194975

The process I used for the BLOBs was this:
1. Copy the data from Oracle to SQL Server, excluding the BLOB column.
2. Export the BLOBs to files, using the ID field as part of the name, like (ID) & FileName. FileName was a column in my Oracle table, which was nice and pretty well needed so you can identify the MIME/file type (XLS, DOC, PDF). A sketch of this step is below.
3. Finally, update the SQL Server table by loading each file back into the BLOB column.

I had some other "things" in the code to deal with missing files from the Blob column but that was simple.
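
For step 2, here is a minimal sketch of the GetChunk approach from the KB article above, written as Access VBA using ADO (it needs a reference to the Microsoft ActiveX Data Objects library). The DSN name MyOracleDsn, the table DOCS, the columns ID, FILENAME and BLOB_DATA, and the folder C:\Export are all placeholder names, so adjust them to your own schema:

Sub ExportBlobsToFiles()
    ' Sketch only: write each BLOB to a file named <ID>_<FileName> using ADO GetChunk
    Const CHUNK_SIZE As Long = 16384
    Dim cn As New ADODB.Connection
    Dim rs As New ADODB.Recordset
    Dim chunk() As Byte, bytesLeft As Long, fileNum As Integer

    cn.Open "DSN=MyOracleDsn"    ' the ODBC DSN pointing at the Oracle database
    rs.Open "SELECT ID, FILENAME, BLOB_DATA FROM DOCS", cn, adOpenForwardOnly, adLockReadOnly

    Do Until rs.EOF
        fileNum = FreeFile
        Open "C:\Export\" & rs!ID & "_" & rs!FILENAME For Binary Access Write As #fileNum
        bytesLeft = rs!BLOB_DATA.ActualSize
        Do While bytesLeft > 0
            chunk = rs!BLOB_DATA.GetChunk(CHUNK_SIZE)    ' read the BLOB a chunk at a time
            Put #fileNum, , chunk
            bytesLeft = bytesLeft - CHUNK_SIZE
        Loop
        Close #fileNum
        rs.MoveNext
    Loop

    rs.Close
    cn.Close
End Sub

Note that this sketch does not handle rows where the BLOB is missing; that is the sort of thing the extra "things" in the real code dealt with.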
 

Move your SharePoint 2010 logs off of your C drive

There are two sets of logs you want to move: the diagnostic logs and the usage logs. An important note is that every machine in the farm must have the same paths for this to work. If one of them doesn’t have a D drive (or whatever drive you choose), SharePoint will freak out.

Here are the steps:

Diagnostic logs:

Central Admin > Monitoring > Configure Diagnostic Logging (/_admin/metrics.aspx). The setting is the “Trace Log” path at the bottom. I strongly recommend only changing the drive letter. Leave the path alone. It’ll make it easier for you to find things later on. You can also use PowerShell to change this. The cmdlet is Set-SPDiagnosticConfig and the parameter is –LogLocation.

 

With PowerShell:

Set-SPDiagnosticConfig -LogLocation "E:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\LOGS"

Usage logs:

Central Admin > Monitoring > Configure web analytics and health data collection (/_admin/LogUsage.aspx). The setting is “Log file location”. Set it to the same path you used for the Trace Log above. Again, don’t get fancy and put it somewhere like “D:\SharePoint\Stuff\Things\LogsAreHiddenHere”. The PowerShell cmdlet to alter this is Set-SPUsageService and the parameter is –UsageLogLocation.

 

With PowerShell:

Set-SPUsageService -UsageLogLocation "E:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\LOGS\"


Your disk savings won’t be crazy big, but every little bit counts. By way of comparison, the Logs folder on this web server is taking up 3 GB.

The Usage logs get removed once they’re parsed and the Trace logs in that directory go back to 12/19/2011. Your server will certainly have more traffic than mine does, so your logs will probably be larger.