Blogging continues at blog.predica.pl!

•October 10, 2012 • Leave a Comment

Hey Everyone,

As you may have noticed, this blog has been idle for over two years! If I ever get the time and energy to start blogging again, you will hear from me here:

http://blog.predica.pl/ -> I definitely encourage you to visit and bookmark this site if you are into Identity & Access Management or SharePoint. Our Predica team is blogging there and I hope to join them soon :)

How to disable RBS in SharePoint 2010

•June 19, 2010 • 8 Comments

Continuing the struggle with Remote BLOB Storage (RBS) and SharePoint. I have decided against using it (for now) – why? Managing it is a pain, so for now I just use dedicated SQL databases for site collections (yes, you can do this with SharePoint 2010). RBS in my virtualized scenario did not make much sense anyway, since I was still using the same physical disks and only moving content from SQL to NTFS. That would make sense for very large content databases, but with SharePoint 2010 the recommended maximum is now 200 GB. So use RBS only if you have databases growing to or above 200 GB and you have separate (usually slower/cheaper) storage disks for your RBS FILESTREAM store.

But how do you disable RBS on your SharePoint 2010 content DB? I found the steps just as difficult to figure out as those for installing it. Below are my findings on how I managed to disable RBS and remove it completely from my SharePoint farm (I had one content DB using it).

Back up first: back up the site collection with stsadm, back up the SQL content database, and back up the RBS BLOB store (NTFS – copy it while the SQL Server service is stopped).
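For reference, a rough sketch of how that backup could look – the site URL, SQL instance name and paths below are made up, so adjust them to your farm:

stsadm -o backup -url http://sp2010/sites/teamsite -filename C:\Backups\teamsite.bak
sqlcmd -S .\SHAREPOINT -Q "BACKUP DATABASE [WSS_Content] TO DISK = N'C:\Backups\WSS_Content.bak'"
# the RBS BLOB store itself is just an NTFS folder – copy it while the SQL Server service is stopped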

Migrate all content off RBS back into SQL and disable RBS for the content DB:

$cdb=Get-SPContentDatabase <ContentDbName>

$rbs=$cdb.RemoteBlobStorageSettings

$rbs.GetProviderNames()

$rbs.SetActiveProviderName("")

$rbs.Migrate()   # note: this might take some time depending on the amount of data in your RBS store

$rbs.Disable()

Change the default RBS garbage collection window to 0 on your content db:

exec mssqlrbs.rbs_sp_set_config_value 'garbage_collection_time_window', 'time 00:00:00'

exec mssqlrbs.rbs_sp_set_config_value 'delete_scan_period', 'time 00:00:00'
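If you prefer running those from a command prompt instead of SSMS, something along these lines should do (the SQL instance and content database names are made up):

sqlcmd -S .\SHAREPOINT -d WSS_Content -Q "exec mssqlrbs.rbs_sp_set_config_value 'garbage_collection_time_window', 'time 00:00:00'"
sqlcmd -S .\SHAREPOINT -d WSS_Content -Q "exec mssqlrbs.rbs_sp_set_config_value 'delete_scan_period', 'time 00:00:00'"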

Run RBS Maintainer (and disable the task if you scheduled it):

"C:\Program Files\Microsoft SQL Remote Blob Storage 10.50\Maintainer\Microsoft.Data.SqlRemoteBlobs.Maintainer.exe" -connectionstringname RBSMaintainerConnection -operation GarbageCollection ConsistencyCheck ConsistencyCheckForStores -GarbageCollectionPhases rdo -ConsistencyCheckMode r -TimeLimit 120

Uninstall RBS:

On your content DB run: exec mssqlrbs.rbs_sp_uninstall_rbs 0

Uninstall SQL Remote Blob Storage from Add/Remove Programs (Programs and Features).

I found that there were still filestream references in my DB, so run this on your content DB:

ALTER TABLE [mssqlrbs_filestream_data_1].[rbs_filestream_configuration] DROP column [filestream_value]

ALTER TABLE [mssqlrbs_filestream_data_1].[rbs_filestream_configuration] SET (FILESTREAM_ON = "NULL")

Now you can remove the file and filegroup for filestream:

ALTER DATABASE yourdbname REMOVE FILE RBSFilestreamFile;

ALTER DATABASE yourdbname REMOVE FILEGROUP RBSFilestreamProvider;
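To double-check that nothing FILESTREAM-related is left behind, these queries against the content DB should now come back empty (instance and database names are made up):

sqlcmd -S .\SHAREPOINT -d WSS_Content -Q "SELECT name, type_desc FROM sys.filegroups WHERE type = 'FD'"
sqlcmd -S .\SHAREPOINT -d WSS_Content -Q "SELECT name, type_desc FROM sys.database_files WHERE type_desc = 'FILESTREAM'"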

Last step: disable FILESTREAM in SQL Server Configuration Manager for your instance (if you do not use it anywhere else besides this single SharePoint content DB), restart the SQL Server service, run iisreset and test.
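Side note: Configuration Manager controls FILESTREAM at the Windows service level; the instance-level setting can also be turned off with sp_configure – but only do this if nothing else on that instance uses FILESTREAM (instance name below is made up):

sqlcmd -S .\SHAREPOINT -Q "EXEC sp_configure 'filestream access level', 0; RECONFIGURE"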

You may find these links useful:

http://sqlrbs.codeplex.com/Thread/View.aspx?ThreadId=204627

http://technet.microsoft.com/en-us/library/ff628255.aspx

http://technet.microsoft.com/en-us/library/ff628259.aspx

http://beyondrelational.com/blogs/jacob/archive/2010/03/11/completely-removing-filestream-features-from-a-sql-server-2008-database.aspx

SharePoint 2010 and SQL Remote Blob Storage Issue

•May 29, 2010 • 5 Comments

Have you tried enabling RBS on SharePoint 2010? There are a bunch of blog/TechNet articles on how to do this. It's a real pain (administration-wise), but it can be made to work. Two things to watch out for:

1. Garbage collection. It's not done out of the box; you need to configure the RBS Maintainer. See: http://blogs.msdn.com/b/sqlrbs/archive/2010/03/19/running-rbs-maintainer.aspx, http://blogs.msdn.com/b/sqlrbs/archive/2008/08/08/rbs-garbage-collection-settings-and-rationale.aspx and the RBS .chm help file. You should set it up in Task Scheduler (a rough sketch follows after point 2). And as far as I found out, there is a 30-day period during which files are still not deleted – a kind of retention period, which makes sense for a backup strategy. It's hard to verify – documentation on the details is scarce and the rbs_ SQL tables are not that obvious to read.

2. Files larger than ~1.2 MB will fail to upload IF you did not allow client access to FILESTREAM. You will see an “access is denied” SqlRemoteBlobs.RemoteBlobStoreException. So go in and enable client access (I tried setting share permissions for this, but that did not work out). This is what you should have in SQL Server Configuration Manager for that instance:

[Screenshot: SQL Server Configuration Manager – FILESTREAM properties for the instance, with all access options (including remote client access) enabled]

This is because, by default, 1.2 MB is the threshold above which SQL Server switches to out-of-band access to the BLOBs (ref: http://blogs.msdn.com/b/sqlrbs/archive/2010/03/31/rbs-filestream-provider-small-blob-optimization-settings.aspx).
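As promised in point 1, here is roughly how the Maintainer could be put into Task Scheduler – the task name, schedule and connection string name are only examples (the Maintainer command line itself is the one shown in my RBS removal post above), and watch the quoting around the path with spaces:

schtasks /create /tn "RBS Maintainer" /sc weekly /d SUN /st 02:00 /ru SYSTEM /tr "\"C:\Program Files\Microsoft SQL Remote Blob Storage 10.50\Maintainer\Microsoft.Data.SqlRemoteBlobs.Maintainer.exe\" -connectionstringname RBSMaintainerConnection -operation GarbageCollection ConsistencyCheck ConsistencyCheckForStores -GarbageCollectionPhases rdo -ConsistencyCheckMode r -TimeLimit 120"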

It seems not many people use this feature, but once you start having hundreds of GB of content, your SQL Server will quickly die without RBS… On the other hand, RBS+SharePoint integration is weak (close to none): it is hard to set up, hard to maintain, and backup/restore gets even more complex – so I suggest using it only on your largest content DBs and setting a size limit above which files are moved out to RBS, via PowerShell (example for 512 kB):

$cdb = Get-SPContentDatabase "WSS_Content"

$cdb.RemoteBlobStorageSettings.MinimumBlobStorageSize = 524288

$cdb.Update()

I have tons of other small internal KB articles like this one, developed during my work, so ping me by mail if you have some strange issue that Google is unable to answer.

I do have one unanswered problem though:

When I upload an Office 2007 document with custom server properties from another SharePoint library (e.g. a 2007 one) to my new SharePoint 2010 farm, an event handler we have is not fired. All other docs (non-Office 2007) work fine, and as soon as I remove the server properties from that document the event handler fires… I have no idea if there is a switch somewhere in SharePoint 2010 that would resolve this – it seems SharePoint 2010 is trying to read those server properties and this causes some issue in our custom-developed code :( Debugging doesn't help either, as the event is simply not fired. The ULS logs are silent on this. If you hit this or saw a similar issue, let me know!

“The given key was not present in the dictionary” issue in SharePoint 2010 RTM

•May 13, 2010 • 8 Comments

Lately I hit a very strange issue while configuring SharePoint 2010. It installed fine on Windows Server 2008 R2 with a local SQL Server 2008 R2 instance. However, I got strange errors in Central Administration, e.g.:

  • A “The given key was not present in the dictionary” error when attempting to run the Farm Configuration Wizard
  • Moreover, it seemed as though the same user had different permissions in Central Administration depending on whether it was browsed locally on the server or remotely!
  • The same “The given key was not present in the dictionary” error repeated itself while attempting to configure SharePoint and SQL Reporting Services integration.

The SharePoint logs (in the “14” hive) showed that this method was failing:

GetUserPropertyFromAD(SPWebApplication webApplicaiton, String loginName, String propertyName)

So I thought it was an AD permissions issue – and it was! I had to grant AUTHENTICATED USERS read permission on the MOSS service accounts in AD (read access to just ‘some’ of the account information would probably be enough); I granted it on the DB access account and the managed services account. After that, an IISRESET and it worked.
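If you prefer the command line over the Security tab in Active Directory Users and Computers, dsacls can grant that read permission – the DN below is made up for illustration, and if dsacls does not resolve the well-known group in this form, just do the same thing from the ADUC Security tab (GR = generic read; repeat for each service account):

dsacls "CN=sp-dbaccess,OU=Service Accounts,DC=yourdomain,DC=local" /G "NT AUTHORITY\Authenticated Users:GR"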

Again Microsoft could improve its error messages – nothing new ;D

VHD Native Boot – some gotchas to be aware of [update]

•February 18, 2010 • 1 Comment

VHD native boot – the nice new feature in WS08R2 and Win7 that allows booting from a VHD. It works; you can even boot from a VHD on an external USB drive, which is nice, since Windows normally does not allow booting from external USB (it treats it as REMOVABLE and won't install there).

However, there are two issues I ran into that were undocumented on the web:

  1. You cannot perform a BARE METAL backup of a WS08R2 machine running from a native boot VHD. I needed to do this to transfer a VHD from one piece of hardware to a different one. Oops – no option to do that :( And since this is a fully fledged installation (SQL, SharePoint etc.), Sysprep is not an option.
  2. If you want to move a native boot VHD to Hyper-V, you need to (see the command-line sketch after this list):
    1. Mount the VHD and load its SYSTEM hive in regedit (File > Load Hive).
    2. Go to ControlSet001\Services\intelide in the loaded hive (CurrentControlSet\Services\intelide on a running system) and change Start to 0 (start at boot).
    3. Only then will you be able to boot a Hyper-V VM from the VHD you used for native boot.
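For reference, the hive editing from step 2 can also be done from an elevated command prompt. The VHD path and the drive letter it mounts under are made up here; also note that an offline hive has no CurrentControlSet, so you edit ControlSet001 (or whichever control set the hive's Select key points to):

# attach the VHD first (Disk Management, or diskpart: select vdisk file=D:\VHDs\nativeboot.vhd, then attach vdisk)
reg load HKLM\OFFLINE E:\Windows\System32\config\SYSTEM
reg add HKLM\OFFLINE\ControlSet001\Services\intelide /v Start /t REG_DWORD /d 0 /f
reg unload HKLM\OFFLINE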

One more interesting thing my friend Jakub pointed me to: if you want to move the VHD from one piece of hardware to another, remember this:

bcdedit /set {guid} detecthal on (Refer to: http://technet.microsoft.com/en-us/library/dd799299(WS.10).aspx)

For most other issues you will find useful links on the web, e.g.:

http://blogs.msdn.com/bramveen/archive/2009/10/27/swapping-between-native-vhd-boot-and-hyper-v.aspx

http://www.markwilson.co.uk/blog/2009/10/native-vhd-boot-windows-7-or-server-2008-r2-from-an-external-usb-drive.htm

A nice link about doing VHD boot without any OS (note that you can use a normal Windows 7/R2 installation DVD instead of WinPE): http://blogs.msdn.com/mikeormond/archive/2009/10/09/boot-windows-7-from-vhd-without-installing-a-native-os.aspx

And a step-by-step video: http://www.ditii.com/2009/10/19/native-boot-windows-7-windows-server-2008-r2-from-vhd-on-a-windows-xp-pc/2/

And my friend's step-by-step for booting a VHD off USB (in Polish though): http://app-v.spaces.live.com/Blog/cns!12E9A21E4AEEFADB!275.entry

SEARCH TAGS: bare metal backup native vhd boot, native vhd boot does not start in hyper-v

SharePoint, Kerberos and the Internet…

•January 22, 2010 • 4 Comments

Lately I've been doing some work around SharePoint solutions. Our company (www.predica.pl) focuses on business applications built on top of the Microsoft SharePoint + SQL BI platform, delivered to customers over the Internet in the cloud computing model. This is still (at least in Poland) a novelty and there are many challenges – not only technical, but also business: it's not easy to find customers for this kind of advanced IT solution. But since my blog is supposed to be at least a bit technical, let me focus on that first part :)

In our team I'm also the 'infrastructure guy', which means that I forgot how to write code and now have to do the boring, dirty stuff ;) Anyway, if you are configuring a platform that uses IIS, SharePoint, SQL databases, Reporting Services and Analysis Services and want to expose it to the Internet, you will face many challenges – regardless of whether you are a programmer or an "IT pro/admin". This Microsoft platform is powerful in its potential, but still maturing in aspects like ease of programming and installation/configuration/maintenance/operations.

Anyway, let's get to the point: you will find lots of useful information on the Internet about how to configure Kerberos and SharePoint. However, most of it focuses on INTRANET scenarios. Here I want to give some short tips/pointers on the most common "hiccups" when exposing a solution built from several technologies (SharePoint Excel Services and Reporting Services, SQL Analysis Services/OLAP) over the Internet. My whole list of issues met along the way is huge, but let me give you the top 3, which caused my biggest headaches (barely documented or not documented anywhere on www.google.com ;)

  1. If you expose SharePoint web apps to the Internet, you cannot use Kerberos on those sites – why? Because you can use either forms based authentication or Windows authentication, and the latter will end up as NTLM (your user out there on the Internet won't find your internal DC, and even if he did, he would not get through all the firewalls with his Kerberos ticket). But if you want (or need, as point 2 shows) to delegate user credentials to the backend, you will need to configure CONSTRAINED DELEGATION with protocol transition, where the first hop is NTLM/forms (or any other auth) and the subsequent hops are KERBEROS – see the SPN sketch after this list. Here is the must-read: http://www.adopenstatic.com/cs/blogs/ken/archive/2007/07/19/8460.aspx.
  2. If you want to expose OLAP cubes over the Internet, there is a way: the magical msmdpump.dll. Two things to watch out for:
    1. You need to use this authentication scheme: basic auth to the IIS website hosting msmdpump.dll, then Kerberos to the backend OLAP server (which is point 1).
    2. If you configure the above, read http://blogs.msdn.com/psssql/archive/2009/04/03/errors-may-occur-after-configuring-analysis-services-to-use-kerberos-authentication-on-advanced-encryption-standard-aware-operating-systems.aspx and install the hotfix mentioned there on ALL your farm and backend servers. Otherwise you will see strange "message altered in transit" errors and OLAP will not work correctly. This applies both to OLAP over the Internet (msmdpump.dll) and to OLAP over Excel Services.
  3. There is a problem if you use the same EXCEL workbook to connect to an OLAP cube over the Internet (pointing to msmdpump.dll) and still want to VIEW it in the browser via Excel Services: that will fail because of how the Excel Services security model is currently implemented. I don't want to go into details on that, but MSFT is aware of it and is "discussing whether to fix it or not" (the usual ;)). The only workaround is to dynamically, in code, substitute the EXTERNAL .odc connecting to msmdpump.dll with an INTERNAL .odc that connects directly to Analysis Services whenever the workbook is queried via Excel Services. Not nice, but otherwise you cannot both view the same Excel file with a PivotTable in ECS and use it over the Internet.
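To give an idea of the Kerberos plumbing behind points 1 and 2: the accounts running the web application pool, SQL Server and Analysis Services need proper SPNs, and the app pool account then gets constrained delegation with protocol transition ("use any authentication protocol" on its Delegation tab in AD) to the SQL/OLAP services. A sketch with made-up account and host names:

setspn -S HTTP/portal.yourcompany.com YOURDOMAIN\svc-sp-apppool
setspn -S MSSQLSvc/sqlbox.yourdomain.local:1433 YOURDOMAIN\svc-sql
setspn -S MSOLAPSvc.3/olapbox.yourdomain.local YOURDOMAIN\svc-ssas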

To get all this right it's important to understand Kerberos, and the Internet is where you will find more than enough info on this. For troubleshooting I found these tools useful:

  • Netmon (of course)
  • Fiddler  (‘lightweight http netmon’;))
  • Sharepoint logs
  • IIS failed request logging
  • and of course www.google.com

If you have any questions or some feedback (other big issues you have met in such setups), ping me by mail or through this blog (comments/contact).

TFS2010 Beta2 installation

•October 25, 2009 • Leave a Comment

Yesterday I installed Visual Studio Team Foundation Server 2010 Beta 2. I had some issues (otherwise I wouldn't have mentioned it on my blog, right? ;))

First off, I had this error even though setup reported no errors: I could not administer TFS (e.g. administer security) – I received a 500 Internal Server Error, also when trying to browse the /tfs website. I spent some time trying to figure it out and then went for the 'easy' solution: set it up on a brand new server (in my case a VM) running WS2008 SP2 x64. I did NOT add the web server role to the server; I just let the TFS setup do it. I needed the basic installation only (no SharePoint or Reporting, just source control and build). This time it worked :)

I figure this could have been due to some settings in IIS 7 on the previous server, since it was also hosting other services (WSUS and Visual SourceSafe). BTW: if you are co-hosting VSS and WSUS on a 64-bit IIS 7, you will need to turn off dynamic compression in IIS (see: http://forums.iis.net/t/1149768.aspx) – I spent a whole day troubleshooting this!
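For the record, dynamic compression can also be switched off from the command line (this is the generic IIS 7 way, applied server-wide):

%windir%\system32\inetsrv\appcmd.exe set config -section:urlCompression /doDynamicCompression:"False" /commit:apphost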

TFS 2010 for now seems to work great for us. It's way better than VSS for Internet-based source control, and easier to set up than TFS 2008. Some things to note: you will still need Team Explorer 2010 to create team projects, but you will be good to go using Team Explorer 2008 for most tasks – just remember to apply Visual Studio 2008 SP1 and this GDR update http://go.microsoft.com/fwlink/?LinkId=166481 on top of it. And be careful: by default TFS 2010 (that was true for 2008 too) allows shared checkouts – this can lead to some problems during merges, so the way to disable it is in Team Explorer (at the collection or project level) by choosing the collection settings and source file types (this is a per-file-type setting).

 