Here are the slides for my SharePoint Saturday Session.
I came across an issue at a client that had me banging my head against the wall for a while, so I thought I should share. I was developing an HTTPModule that was being added to the web.config through a SharePoint feature. This part was working fine; then all of a sudden the site started throwing “500 Internal Server Error” messages. So I figured, OK, I’ll just check Event Viewer on the machine. What did I find? Nothing out of the ordinary. I tried a few other things, including completely removing my HTTPModule, and still no dice. I tried resetting IIS and rebooting the machine; none of it worked. So I figured I would double-check and make sure the module was removed.
So I went into IIS and selected the web site I was working with. I then double-clicked the Modules option and was presented with an error: “Cannot read configuration file because it exceeds the maximum file size”. After a quick search on Bing I found that ASP.NET has a limitation that the web.config file cannot exceed 250 KB. I checked my config file and it was 257 KB. I trimmed out some duplicated lines caused by another web.config modification from a feature, and the site came back up. Hopefully this is helpful to someone out there.
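If you want to catch this before the site goes down, a quick sketch like the following can report web.config sizes across your web applications. This is a hypothetical example, not the script I used; it assumes it runs on a SharePoint server with the SharePoint snap-in available, and the 240 KB warning threshold is just a buffer below the 250 KB limit.

```powershell
# Sketch: flag any SharePoint web application web.config approaching the
# default ~250 KB ASP.NET configuration file size limit.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

Get-SPWebApplication | ForEach-Object {
    $webApp = $_
    # Each web application can have multiple IIS zones, each with its own web.config
    $webApp.IisSettings.Values | ForEach-Object {
        $configPath = Join-Path $_.Path "web.config"
        if (Test-Path $configPath) {
            $sizeKB = [math]::Round((Get-Item $configPath).Length / 1KB)
            if ($sizeKB -gt 240) {
                Write-Warning "$($webApp.DisplayName): web.config is $sizeKB KB (limit ~250 KB)"
            } else {
                Write-Host "$($webApp.DisplayName): web.config is $sizeKB KB"
            }
        }
    }
}
```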
I was working at a client where we had to rebuild their search service to try and alleviate some issues they were experiencing, and as part of this I came to the realization that manually creating managed properties has got to stop. The problem is that there is no easy way to move service-scoped properties from one server to another. As a developer, the fact that this is not easily deployed bugs me. I want to be able to deploy the properties consistently. I don’t want to have to go back, create them again, and wait for another full crawl because I missed one or typed a name in wrong. What’s worse, there is no way of putting this configuration into version control to create a baseline configuration.
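The core of scripting a managed property looks something like the sketch below. The property name, type, and crawled property name are placeholders for illustration; the cmdlets are the standard SharePoint search PowerShell cmdlets.

```powershell
# Sketch: create a managed property and map a crawled property to it,
# instead of clicking through Central Administration.
$ssa = Get-SPEnterpriseSearchServiceApplication

# Create the managed property (Type 1 = Text); "MyCustomerID" is a placeholder
$managed = New-SPEnterpriseSearchMetadataManagedProperty -SearchApplication $ssa `
    -Name "MyCustomerID" -Type 1 -Description "Customer ID from the intranet"

# Find the crawled property to map; "ows_CustomerID" is a placeholder
$crawled = Get-SPEnterpriseSearchMetadataCrawledProperty -SearchApplication $ssa `
    -Name "ows_CustomerID"

# Wire the two together
New-SPEnterpriseSearchMetadataMapping -SearchApplication $ssa `
    -ManagedProperty $managed -CrawledProperty $crawled
```

A script built from these pieces can live in version control, giving you the baseline configuration and repeatable deployments the post is after.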
I was working at a client doing a health assessment, and during the assessment we talked about problems with users being able to turn on versioning. Each time a modification is made, SharePoint keeps a copy of that file, so if versioning is turned on this can drastically increase your content database sizes pretty quickly. I set up this script to help identify what the settings are. It could also be modified to enforce a policy on what the limits should be.
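A minimal version of such a report might look like the following. This is a sketch, not the actual script from the assessment; the site URL is a placeholder, and you could swap the Write-Host for an assignment to MajorVersionLimit to enforce a policy instead of just reporting.

```powershell
# Sketch: report versioning settings for every document library in a site collection.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$site = Get-SPSite "http://sharepoint/sites/teamsite"   # placeholder URL

foreach ($web in $site.AllWebs) {
    foreach ($list in $web.Lists) {
        if ($list.BaseType -eq "DocumentLibrary") {
            Write-Host ("{0}/{1}: versioning={2}, major version limit={3}" -f `
                $web.Url, $list.Title, $list.EnableVersioning, $list.MajorVersionLimit)
        }
    }
    $web.Dispose()   # SPWeb/SPSite objects should be disposed explicitly
}
$site.Dispose()
```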
I found it strange that Project Server does not have a recycle bin for cases of accidental deletion of a project. One of my clients also found this strange, so I set up a project recycle bin. It works fairly simply by taking advantage of Project Server’s archiving system. This worked pretty well, except that there is no way to specify how long a project will stay in the archive. So I figured the simplest way of accomplishing this was to set up a timer job that accesses all of the projects in the archive and checks for the custom description set when the project is sent to the “Recycle Bin”. It then checks when the project was deleted to see if it needs to be flushed from the archive. So far so good. Project Server provides a method for programmatically deleting a project from the archive. The documentation seems fairly clear; it can be found at http://msdn.microsoft.com/en-us/library/office/gg204474.aspx. So I had my timer job call this method on the projects that needed to be removed. The documentation says to call the function passing the following parameters in this order:
- Guid JobUID
- Guid ProjectUID
- Guid ArchiveUID
So I used the following line of code.
archiveSvc.QueueDeleteArchivedProject(JobId, project.PROJ_UID, project.PROJ_VERSION_UID);
Well, whenever the job ran, nothing was getting removed. I pulled my hair out trying to figure out why this wasn’t working. Then I had an epiphany: maybe the parameters are reversed. So I created a test console app where I could try changing the values around. What I found was that if I reversed the last two parameters, the project gets deleted as expected. So I changed the code in my timer job to:
archiveSvc.QueueDeleteArchivedProject(JobId, project.PROJ_VERSION_UID, project.PROJ_UID);
Now everything works as expected. So, contrary to the documentation, you actually pass the version UID where it says ProjectUID and the project UID where it says ArchiveUID. Go figure.
Thank you everyone for a great day at SharePoint Saturday in Palo Alto and thanks to everyone that attended my session. As I said, here are the slides for my presentation. I am also including a link to a survey if you didn’t get a chance to fill one out. I tried uploading the raw slides but they are too big. If you would like a digital copy and cannot get it from SlideShare shoot me a message and I will send it to you. Let me know if you have any questions.
I ran into a problem at a client where the Site Collection’s user list was missing the email addresses for most of the users, so our workflows were not working. It turns out that SharePoint 2010 and 2013 have different defaults for the User Profile store. In our 2010 environment we went with the out-of-the-box setup for the profile mapping to AD, and so we did the same in setting up 2013. In 2010 the work email field was mapped to the mail property in AD; in 2013 it was mapped to the proxyAddresses property. Once we corrected that, we needed to refresh the users list. I found an entry that was written for 2010, but the information is still valid for 2013. I thought I would link and share this in case anyone else has the same problem, or in case I need to remember how to do this again.
Essentially there are two timer jobs that sync the user profile data to the sites: “User Profile Service Application – User Profile to SharePoint Full Synchronization” (hourly) and “User Profile Service Application – User Profile to SharePoint Quick Synchronization” (every 5 minutes). The article also mentions a couple of stsadm commands that can help if the jobs are failing.
stsadm.exe -o sync -listolddatabases 0
This will display all the databases and when they were last synced.
stsadm.exe -o sync -deleteolddatabases 0
This command will delete the old information as if no sync had ever happened. You can then run the full synchronization job to update all the sites.
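If you would rather kick the jobs off on demand than wait for their schedules, a sketch like this works from the SharePoint Management Shell. Matching on DisplayName is an assumption here, since the internal job names can vary between environments.

```powershell
# Sketch: start the User Profile to SharePoint sync timer jobs immediately.
Get-SPTimerJob |
    Where-Object { $_.DisplayName -like "*User Profile to SharePoint*Synchronization*" } |
    ForEach-Object {
        Write-Host "Starting: $($_.DisplayName)"
        Start-SPTimerJob $_
    }
```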
Here is the source article if you would like more information.
Thanks for an awesome event. I have posted the slides to SlideShare since they are rather large. The display templates are attached to this post.
I am working on getting the video onto YouTube, so stay posted. It does look, however, like my camera ran out of space near the end of the presentation.
Feel free to contact me if you have any questions.
Gotta love undocumented features. Thanks to SharePoint 2013, I have come to learn that for both the showModalDialog function and the showWaitScreenWithNoClose function you do not need to specify a height and width. If you do not provide these values, SharePoint will automatically size your content for you.
I have been taking advantage of this “feature” for modal windows for quite some time. In the process of upgrading some web parts to SharePoint 2013, the larger fonts were causing problems with my wait messages. So I thought, why don’t I try removing the height and width and see if SharePoint handles it like it does modal windows. I removed the last two parameters, tried again, and it worked beautifully.
For example, change this:

SP.UI.ModalDialog.showWaitScreenWithNoClose("Processing", "Your request is being processed. Please wait while this process completes.", 100, 200);

to this:

SP.UI.ModalDialog.showWaitScreenWithNoClose("Processing", "Your request is being processed. Please wait while this process completes.");
and SharePoint will handle the sizing, making it easier to deal with variable content size. Even on MSDN the parameters are not marked as optional, so hopefully they don’t pull this in the future.
I decided this week to move my demo from my laptop, with its limited RAM, to Amazon’s cloud. Amazon has a pretty sweet setup where you can pay by the hour for the server time you need. I now have a server that can handle the requirements of SharePoint 2013 without as many headaches as I had with my laptop. I looked at using CloudShare when I was prepping my demo for SharePoint Saturday Utah, but I didn’t feel like paying that much for a whole month when I only needed the server running for a few hours.
I have the server set up after some fun OAuth issues in SharePoint 2013. Now comes the fun part of transferring my demo content to the new server.
For those thinking of trying to run SharePoint 2013 on your own laptop: if you have 8 GB of RAM or less, do not go there. You can get it working, but running all of the required pieces on one machine is too much. In order to do my last demo I had to turn off a bunch of services to free up memory. I had to severely trim the services for Search, and the sad part was that my presentation was on search. So I had it crawl the content I needed, then turned off the crawler and indexer to free up the memory. Luckily no one asked for live search results or changes. Now, with a server that I can turn on whenever I want and only pay for the time it is running, I no longer have this problem. I love technology.