Sorry for the long delay in posting my summary of PASS Summit 2012 activities, but I needed to enjoy the Thanksgiving break off from work and travel.
I left off in my previous post on Monday afternoon, before the Networking Event that took place around dinner time. The Networking Event has been put on for many years now by Steve Jones (b|t) and Andy Warren (b|t) before PASS Summit starts and is always a good place to meet lots of people. I had overdone it earlier in the day and wasn't feeling very well, so sorry if you tried to meet with me and I didn't make it around to talk to you. Thankfully the issues I was having passed later in the evening, but for a while I wasn't sure what was going on.
Tuesday was the first day that I had Chapter Leader meetings to attend throughout the day. First was the meeting with all of the SQL Saturday organizers, where we had many discussions on where we are with SQL Saturdays and where we want to take them in the future. Lots of very helpful information that we will be using for our next SQL Saturday on 9/21/2013! After lunch was the Chapter Leaders meeting, where we split up into multiple groups to talk about specific issues around starting/running/extending our User Groups. Again, lots of great info that I am looking forward to implementing in our group.
Tuesday evening was the Welcome Reception put on by PASS. As usual, I was a little bit late getting to this event again this year, so I missed the First Timers entering the party (maybe next year I need to be a mentor to force myself to be there when this happens). The party was as big as it has been over the past couple of years, but I was still able to get to the food and drink without too much waiting. After meeting quite a few people, I headed over with a few of the Pragmatic Works guys to the Volunteer Party at The Garage. Even though I had paid for the Exceptional DBA Awards put on by SQLServerCentral.com & Red Gate, I wanted to attend the Volunteer Party since this was the first year I have been invited to it (I hate it when there are multiple parties going on at the same time). I had a great time bowling and watching the election coverage.
Wednesday was finally the first official day of the Summit. It started off with keynotes given by the current PASS President, Bill Graziano (b|t), and various Microsoft representatives. The biggest announcement that I picked up on was the change to allow Multi-Dimensional cubes to be queried with DAX along with MDX, which puts Multi-Dimensional cubes on par with the newer Tabular models. I can't wait to see how this is fully implemented in vNext of SQL 2012 (it will not be included with SQL 2012 SP1, which was also announced). Also announced were changes to ColumnStore Indexes to make them updatable and able to be clustered, which should add some pretty cool new capabilities. The last thing that I remember from that keynote was the Hekaton announcement, an in-memory OLTP solution that is supposed to deliver many-times speed improvements for most database solutions (assuming you have tons of memory on your servers). There were other announcements at the Thursday keynote, but I didn't attend much of that one and heard from others that some of the demos were a bit long; lots of stuff around Big Data/Hadoop.
As with previous years I didn't attend many sessions during Summit; I ended up spending a bunch of time networking, volunteering at the PASS Community Zone and spending some time at the Pragmatic Works booth. It was cool to see the Summit from the perspective of an exhibitor for the first time this year. I did attend a session given by Mark Stacey (b|t) on using NASA data with PowerPivot, which was really cool because he built the whole session around how to get and use the free NASA and ESA data that is out there in his demos. Being a Space Nut, this was right up my alley and gave me some great ideas for future presentations (thanks Mark!). I also saw a great presentation on setting up Kerberos with SharePoint, which, as anyone who has done it knows, is a real pain point; Chuck Heinzelman (t) laid it out in 7 steps, which should help if I end up running into this in the future. I was able to get to one of the Lightning Talk sessions (yet again in way too small a room this year) and saw Bill Fellows (b|t), Jes Borland (b|t), Niko Neugebauer (b|t) and Oliver Engels (t) present on some great topics. As with most conferences with any BI sessions, there was a BI Power Hour at Summit again this year, and as with the previous ones it did not disappoint, showing solutions built with Microsoft BI that you probably never would have thought possible. Great job by Matt Masson (b|t), Chuck Heinzelman (t), Peter Myers, Matthew Roche (b|t), Sean Boon (b|t) and of course Patrick LeBlanc (b|t). I have been to at least 6 different BI Power Hours over the years at various conferences and they are still by far the funniest sessions to go to.
The last session that I remember going to was on the details of the changes that allow DAX queries against Multi-Dimensional cubes. It sounds like they added these changes the right way, including no translation to MDX; DAX is being added as a full-featured query language against the MD cube.
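To give a sense of what this change means, here is a minimal sketch of the kind of DAX query that could now be issued directly against a Multi-Dimensional cube, just as it would be against a Tabular model. The Sales and Date tables, their columns, and the "Total Sales" name are all hypothetical examples, not from any session demo:

```dax
-- Hypothetical model: a Sales fact table related to a Date table.
-- EVALUATE returns a table; SUMMARIZE groups by year and adds an aggregate column.
EVALUATE
SUMMARIZE (
    Sales,
    'Date'[CalendarYear],
    "Total Sales", SUM ( Sales[SalesAmount] )
)
ORDER BY 'Date'[CalendarYear]
```

With the announced changes, a query like this could be sent to an MD cube from any client that speaks DAX, without the engine translating it to MDX first.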
As with every year there were tons of events in the evenings after the sessions ended, including the Exhibitor's Reception on Wednesday night, where I worked as "security" at the Pragmatic Works booth for the 200+ free books that Brian Knight (b|t), Adam Jorgensen (b|t), Mike Davis (b|t) and various other current/past Pragmatic Works employees were signing for free! It was very popular at the reception, and we were able to offer many more books to the attendees thanks to Wiley/Wrox being in the booth next to us with boxes of the books they let us give out. After we ran out of books at the reception, a few of us headed over to get set up at the Hard Rock Cafe for the Pragmatic Works, Microsoft and HP SQLKaraoke event with a live band! It was an awesome party and the first time I have seen karaoke done with a live band, which was really cool. Not sure how we are going to top that party next year in Charlotte, but I'm sure we will come up with something! :) Thursday evening was the night we set for the annual Colorado Dinner at PASS Summit. As in past years we had it at Elliot's Oyster House, and as always they were very accommodating for a group of 20+ people who were all paying separately and ordering whatever they wanted. After the dinner we caught a cab over to the EMP (Experience Music Project) to catch the end of the Attendee Party. I love this museum; in fact I have been there so many times over the years that I actually went ahead and became a yearly paying member. I got to go down to the Science Fiction Museum in the lower 2 levels and enjoy all of the great exhibits of memorabilia from my favorite TV shows and movies. There was even more SQLKaraoke with a live band there; I guess that is the new trend for the Summit this year. I didn't make it to SQLKaraoke at Bush Garden until Friday evening this year, just too much going on during the week.
And that is it for another Summit; it was a crazy week that went by far too fast. I had a great time meeting everyone, whether for the first time or once again, and I'm already looking forward to 2013 in Charlotte!
Tuesday, November 27, 2012
Sunday, November 11, 2012
PASS Summit 2012 - Part 1 (Pre-Summit Activities)
While I'm sitting in the Atlanta airport waiting for my connecting flight to Columbia, SC, I thought it was a great time to get some of my thoughts on last week's PASS Summit 2012 posted. This post will cover the activities I did before Summit.
I arrived in Seattle on Friday morning to spend some time with some of the other Pragmatic Works folks in the Mount Rainier area. Being in Denver, I'm not in direct contact with most of the other Pragmatic Works people on a regular basis, so this was a great way to get to know more of them. Friday night we all ate at the cabin, thanks to Brian Knight, our chef for the evening. The cabin was pretty cool, if a bit old: it had a fireplace, a pool table, a Foosball table and tons of beds for all of us.
We all attempted to hike up Mt. Rainier first thing Saturday morning. The weather was not great as we left the cabin: raining, foggy and pretty windy. We got to the parking lot to get all of our gear set up. This was the first time I have ever hiked anything more than some paved trails in Colorado, but I figured I live a mile high, so I shouldn't have too many issues. Oh, was I completely wrong! It didn't take me very long to realize that even living at altitude doesn't help if you don't also work out regularly at altitude (those that have seen me recently know just how out of shape I am). Thankfully we were prepared for the full group of 9 people to split up into at least 3 smaller groups as the hike went on. I quickly fell back into the back group, which was great because we were all pushing each other on to continue even when, at different times, all of us wanted to just quit and go back. We got all of the way up to around 7000 feet, to the point where the snowfield starts and we would have to put on our snowshoes. That was when Mt. Rainier really let us know that it was in control, with the wind blowing at 30mph+ and sleet and snow blowing in our faces. We stood there for awhile and realized that we could see the first of the groups coming back our way. It turned out that not too much farther ahead they hit snow that was waist deep and even harder going, so we all made the decision to head back to the cars. No one got to the goal of the basecamp (Camp Muir) this year. According to others that have done this hike before, the snow was higher than they had ever seen it. It was very grueling for me, but I'm glad that I tried it and made it as far as I did. I don't think that I will ever need to try and hike a mountain again, but you never know.
After the Mt. Rainier adventure we all headed back to Seattle on Sunday morning. Before leaving I made breakfast for everyone. Thankfully I didn't make anyone sick (or at least they didn't tell me) with my French Toast, Bacon and Hash Brown feast. On the way into Seattle we all stopped at the mall in Bellevue, since some in the group wanted to stop at the Microsoft Store and see about getting a Surface. I had seen Brian Knight's Surface at the cabin previously, so I didn't see the need to get one (I'm already an Apple fanboy, and I'm more interested in the Windows 8 Pro tablets). This was also my first time in a Microsoft Store; it definitely looks like a copy of the Apple Stores, but I guess you need to copy something that works! :) While I didn't get a Surface, I did wander to the back of the store and found an awesome Ogio Bandit backpack that is white and black with gold highlights. As soon as I saw that bag I had to have it, since those are the colors of my alma mater, Purdue University (the only thing missing is a Purdue patch, which I'm going to fix). After a couple of people bought Surfaces, we headed off to P.F. Chang's for lunch. It was great to sit down and talk about the weekend and prepare ourselves for the week to come.
We headed out of Bellevue into Seattle and dropped everyone off at their hotels for the next week. I was staying at the Sheraton again this year. It is definitely my favorite hotel for PASS Summit in Seattle, since you only have to walk half a block to get into the convention center (wish it was cheaper though). The only thing I found after checking in that sucked was that the Sheraton bar, our normal staging area for post-conference activities each night, was closed for renovations. Now we would all have to use Twitter to find out what was going on, or make sure to make plans with attendees at the conference/hotel as we saw each other.
On Monday I didn't have any PASS Summit related activities, so I got up early to make a visit to Top Pot Doughnuts. If you have never been to Top Pot, you should definitely try to go next time you are in Seattle; very good doughnuts! After that I walked down to the pier area and saw the new Seattle Great Wheel, which had been under construction the last 2 trips I made to Seattle. Before lunch I met up with a few other Pragmatic Works people to take the Seattle Underground tour. I have done this tour twice previously, but it is always great to go again to get the different stories on Seattle's very colorful past. If you have not done the Seattle Underground tour, you should; it is the best tour I have done in any city I have visited. I grabbed some lunch after the tour and then headed back to the hotel to recharge before the Networking Event that evening.
That covers all of the pre-Summit activities, so I'll post the details on the rest of the week in a future post.
Labels:
Mt. Rainier,
Ogio,
PASS,
PASS Summit 2012,
Top Pot,
Underground Tour
Midlands SQL PASS Chapter Meeting - 11/13/2012
Just a quick note that I will be going pretty much straight from PASS Summit 2012 in Seattle (stay tuned for a detailed blog post on this) to Columbia, SC to present "SSIS 2012: More Than Just a Pretty UI" at the Midlands SQL PASS Chapter meeting on Tuesday, 11/13 at 5:30pm. If you are in the Columbia, SC area on Tuesday, 11/13, please stop by. Looking forward to presenting at another PASS chapter next week!
For more details go to the Midlands SQL PASS Chapter website.
Monday, October 15, 2012
SQL Saturday #153 and PASS Summit 2012
For those in the Salt Lake City area this weekend, I will be speaking at the SQL Saturday event being held on Saturday, October 20, starting at 9am. I am currently scheduled to speak at 2:45pm in Room 206 on SSIS 2012. For those that are new to the SQL community, SQL Saturdays are free events held throughout the world where SQL Server Professionals share their experiences for a day of learning and networking. If you are in the SLC area this Saturday, you should come by! For more information and to register, please go to http://www.utahsqlsaturday.com. See you on Saturday!
I will also be attending PASS Summit 2012 in Seattle coming up on November 5-9. If you have the means to go to PASS Summit, I would highly recommend it; it is the biggest SQL Server focused conference in the US and a great place to meet SQL Server Professionals from all over the world. There are so many things to do and see both at the conference and after the conference. To get all of the details and register, go to http://www.sqlpass.org/summit. If you are from Colorado and traveling to PASS Summit 2012, we will be having our annual dinner at Elliot's Oyster House on Thursday, November 8th at 6:30pm. You will have to cover your own food and drink, but it is a great chance to meet with the SQL Professionals in the Colorado area outside of the conference. If you are interested, please register at http://cosummit2012dinner.eventbrite.com. Hope to see you there!
Labels:
SQL 2012,
SQL Saturday,
SSIS,
training
Wednesday, September 26, 2012
SQL Saturday #169 Denver Wrap-up
SQL Saturday #169 Denver was held last Saturday (9/22) at the Cherry Creek Presbyterian Church - Community Life Center. After months of planning and worrying about how the event would go, it turned out to be a very successful event! We hit our registration cut-off of 300 late on Friday night, and the final count at the end of the day on Saturday was around 220 actual attendees! That is really good, since we did not hold a SQL Saturday in 2011 (the first for this area was in September 2010), and it shows just how much the SQL Server community has grown.
We started the planning of this event in May/June, which is a bit late for an event of this size. We pulled together a great group of volunteers from the area to help us out and that made all the difference. It is great to see so many people that are willing to give up some of their free time to make sure that we had a successful event. Without them this event would not have been as good as it was, so thank you to everyone that volunteered to help!
The number of sponsors that we had was also a huge help in making the event such a success. Big kudos to Confio Software; a local Colorado company, they came through big as a Gold sponsor with lots of last-minute support in various forms. I also want to send out a big thank you to Quanta Intelligence for sponsoring the Speaker/Volunteer dinner on Friday; everyone had a great time before all of the hard work on Saturday started. Another big thanks to Datavail, who came in as a Gold level sponsor and sponsored the After Party on Saturday evening. Without these sponsors and all of the others - DawaBI, Pragmatic Works, Xtivia, PASS, Red Gate, Fusion-io, Quest Software, COZYROC, Qortex, SQL Cruise, SAPIEN Technologies, McGraw Hill, Wrox and Pluralsight - we could not have held this event.
Since this was the first time I was the lead coordinator for this kind of event, I decided early on that I would not present a session, since I knew I would already have enough stress for the day. That was a good decision, because there were many challenges that needed my attention throughout the day. We have a list of things that we know about now and will keep in mind for the setup of the next event, and for those wondering: yes, I'm sure we will be having another SQL Saturday in the near future. Overall it was a great learning experience, and I'm looking forward to the next one, where I will be able to enjoy it more and roam around and soak it all in.
To get a different perspective on the event, check out the following blog posts that have been posted by others:
Thanks again to everyone that was involved in our SQL Saturday and a big thanks to everyone that attended the event!
Labels:
SQL Saturday,
training
Friday, August 17, 2012
SQL Server 2012 Database Projects
After I got really confused about why some things were possible in different combinations of SQL Server 2012 and Visual Studio 2010 for SQL Server 2012 Database Projects, I thought I would write up what I found out for others who might be as confused as I was.
For those that might not know, Microsoft introduced Database Projects a while back, but they had some issues: the scripts created didn't always work, and the schema comparisons didn't work well either. Being a BI Developer at the time, I didn't spend much time looking at them; I just heard what the other DB Developers were complaining about. Database Projects were meant to be a way for DB Developers and DBAs to get the database "source code" into version control and make it easier to create scripts for deployment into multiple environments.
So, with SQL Server 2012 there were some changes made to Database Projects, and from what I have seen using them lately, they are very good improvements. But the side effect is that there are 2 tools with very similar names that are slightly different. The 2 tools are SQL Server Data Tools and Microsoft SQL Server Data Tools (wow, big difference, huh: one with Microsoft in front, one without).
SQL Server Data Tools (SSDT)
For those that have installed any version of Microsoft SQL Server 2012, you know that you can install SQL Server Data Tools as part of the database instance install. This is the tool that uses the Visual Studio 2010 Shell and is the replacement for the Business Intelligence Development Studio (BIDS) from previous versions of SQL Server. Since they wanted to add more functionality than just BI development projects, the tool needed to be renamed. SSDT still works with all of the BI projects as before.
Microsoft SQL Server Data Tools
This is the web install version that is available for anyone to download and install, without any requirement for SQL Server 2012. As with SSDT, this is built on the Visual Studio 2010 Shell. The difference with this version is that it only does Database Project development (including SQL Azure). If you install only this version of SSDT, you will NOT be able to do any BI development unless you also have the SSDT from SQL Server 2012 installed.
I'm not sure why Microsoft went down this route; I find it very confusing, and it gets even worse based on the version of Visual Studio 2010 that you have installed. If you want to do both BI Projects and Database Projects, you will need to install the SSDT from the SQL Server 2012 install media AND the Microsoft SQL Server Data Tools from the web link above. But even with this, there are some differences in the projects! The pieces that I found missing are only in the Database Projects and are specific to building these projects using the new SQL Server Database Project type.
Below is what the SQL Server Database Project structure looks like when you create it using the web download of SSDT and no version of Visual Studio 2010 installed:
The screenshot below is against the exact same SQL 2012 database; the only difference is that I first created it as a new SQL Server Data-tier Application Project and then converted it to a SQL Server Database Project (you do this by right-clicking the project file in the Solution Explorer and selecting Convert to SQL Server Database Project...) on a machine that had Visual Studio 2010 Ultimate (Premium also works, from what I found researching this) and the SSDT web install:
Notice the differences? There are some folders and scripts that are not created by the "lite" version, like the Scripts folder with the Pre-Deployment/Post-Deployment files, along with others. Based on what I have started to script, I find the Pre/Post-Deployment scripts to be very useful, since they are pulled in automatically when you create the deployment scripts. I use the Post-Deployment script to set up SQL Agent jobs or anything else that can be done after the database has been created.
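As an illustration of that last point, here is a minimal sketch of what a Post-Deployment script that creates a SQL Agent job might look like. The job name, target database and the stored procedure it calls are all hypothetical examples, not from a real project:

```sql
-- Hypothetical Post-Deployment script (e.g. Script.PostDeployment.sql).
-- Runs after the database deploys; creates a SQL Agent job if it doesn't exist.
IF NOT EXISTS (SELECT 1 FROM msdb.dbo.sysjobs WHERE name = N'Nightly Maintenance')
BEGIN
    EXEC msdb.dbo.sp_add_job
        @job_name = N'Nightly Maintenance';

    EXEC msdb.dbo.sp_add_jobstep
        @job_name      = N'Nightly Maintenance',
        @step_name     = N'Rebuild indexes',
        @subsystem     = N'TSQL',
        @database_name = N'MyDatabase',                       -- hypothetical database
        @command       = N'EXEC dbo.usp_IndexMaintenance;';   -- hypothetical proc

    -- Target the local server so the job actually runs.
    EXEC msdb.dbo.sp_add_jobserver
        @job_name = N'Nightly Maintenance';
END
```

Since SSDT processes these scripts in SQLCMD mode, larger projects often keep each job in its own file and pull them into the Post-Deployment script with :r include directives.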
You may be asking: can I just create that folder structure manually in the "lite" project and have it automatically pulled in? Microsoft has actually included all of the pre/post-deployment scripting capabilities in the new SQL Server Database Projects, and you can create them anywhere in your project by adding a new script item and selecting the Pre-Deployment Script or Post-Deployment Script type that appears in the Add New Item dialog. While I don't understand why the initial creation of the projects changed so that these folders/scripts are not automatically included, at least both versions are capable of it; you just have to know how to add it.
I have verified that if you create the SQL Server Database Project as a converted SQL Server Data-tier Application Project in Visual Studio Ultimate/Premium and then open it in the "lite" version, it still sees all the extra folders and will script them correctly.
So this is what I have found after playing around for a while and just about pulling my hair out when I couldn't figure out why one worked differently than the other. I believe most of these differences come from how the SQL Server Data-tier Application Projects (DAC) were set up, and those differences in that project type are carried over when it is converted to a SQL Server Database Project. I still find it very confusing that creating the project one way and converting to the "new" way causes so many differences. Microsoft could have addressed this by making sure that the new SQL Server Database Projects for SQL Server 2012 have the exact same structure as the previous version, but it seems that they see each of these "new" project types as a chance to start over and do it differently.
For those that might not know Microsoft introduced Database Projects awhile back, but they had some issues where the scripts created didn't always work and the schema comparisons didn't work well either. Being a BI Developer at that time I didn't spend much time looking at it, just heard what the other DB Developers were complaining about. Database Projects were meant to be a way for DB Developers and DBAs to help get the database "source code" into version control and make it easier to create scripts for deployment into multiple environments.
So, with SQL Server 2012 there were some changes made to the Database Projects, and from what I have seen using them lately they are very good improvements. But the side effect is that there are 2 tools with very similar names that are slightly different. The 2 tools are SQL Server Data Tools and Microsoft SQL Server Data Tools (wow, big difference, huh? One with Microsoft in front, one without).
SQL Server Data Tools (SSDT)
For those that have installed any version of Microsoft SQL Server 2012, you know that you can install SQL Server Data Tools as part of the database instance install. This is the tool that uses the Visual Studio 2010 Shell and is the replacement for the Business Intelligence Development Studio (BIDS) in previous versions of SQL Server. Since they wanted to add more functionality than just BI development projects, the tool needed to be renamed. SSDT still works with all of the BI projects as before.
Microsoft SQL Server Data Tools
This is the web install version that is available for anyone to download and install without any requirement for SQL Server 2012. As with SSDT, this is built on the Visual Studio 2010 Shell. The difference with this version is that it only does Database Project development (including SQL Azure). If you install only this version of SSDT you will NOT be able to do any BI development unless you also have the SSDT from SQL Server 2012 installed.
I'm not sure why Microsoft went down this route; I find it very confusing, and it gets even worse based on the version of Visual Studio 2010 that you have installed. If you want to do both BI Projects and Database Projects you will need to install the SSDT from the SQL Server 2012 install media AND the Microsoft SQL Server Data Tools from the web link above. But even with this there are some differences in the projects! The pieces that I found missing are only in the Database Projects and are specific to building these projects using the new SQL Server Database Projects.
Below is what the SQL Server Database Project structure looks like when you create it using the web download of SSDT with no version of Visual Studio 2010 installed:
The screenshot below is against the exact same SQL 2012 database; the only difference is that I created this first as a new SQL Server Data-tier Application Project and then converted it to a SQL Server Database Project (you do this by right-clicking on the project file in the Solution Explorer and selecting Convert to SQL Server Database Project...) on a machine that had Visual Studio 2010 Ultimate (Premium also works, from what I found researching this) and the SSDT web install:
Notice the differences? There are some folders and scripts that are not created by the "lite" version, like the Scripts folder with the Pre-Deployment/Post-Deployment files, along with others. Based on what I have started to script, I find the Pre/Post-Deployment scripts very useful since they are pulled in automatically when you create the deployment scripts. I use the Post-Deployment script to set up SQL Agent jobs or anything else that can be done after the database has been created.
You may be asking: can I just create that folder structure manually in the "lite" project and have it automatically pulled in? Microsoft has actually included all of the pre/post-deployment scripting capabilities in the new SQL Server Database Projects, and you can create the scripts anywhere in your project by adding a new script item to your project and selecting the Pre-Deployment Script or Post-Deployment Script types that appear in the Add New Item dialog. While I don't understand why the initial creation of the projects has changed so that these folders/scripts are not automatically included, at least both versions are capable of it; you just have to know how to add the scripts.
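If you want to check which scripts a project will actually pull into the deployment, you can look inside the .sqlproj file itself. This is just a sketch under the assumption that the project file marks these scripts with PreDeploy/PostDeploy item elements in the MSBuild XML (verify against your own .sqlproj; the sample XML below is hypothetical):

```python
# Sketch: list the Pre/Post-Deployment scripts an SSDT project will pull in.
# Assumes the .sqlproj convention of <PreDeploy>/<PostDeploy> item elements
# with an Include attribute -- check your own project file to confirm.
import xml.etree.ElementTree as ET

def deployment_scripts(sqlproj_xml: str) -> dict:
    """Return {'PreDeploy': [...], 'PostDeploy': [...]} from project XML."""
    root = ET.fromstring(sqlproj_xml)
    found = {"PreDeploy": [], "PostDeploy": []}
    for elem in root.iter():
        # Strip the MSBuild namespace prefix, if any, from the tag name.
        tag = elem.tag.split("}")[-1]
        if tag in found:
            found[tag].append(elem.get("Include"))
    return found

# Hypothetical minimal project fragment for illustration only.
sample = """<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <ItemGroup>
    <PostDeploy Include="Scripts\\Script.PostDeployment.sql" />
  </ItemGroup>
</Project>"""
print(deployment_scripts(sample))
# {'PreDeploy': [], 'PostDeploy': ['Scripts\\Script.PostDeployment.sql']}
```

A project can only have one active PreDeploy and one active PostDeploy script at build time, so a check like this is mostly useful for confirming that a manually added script actually got the right build action.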
I have verified that if you create the SQL Server Database Project by converting a SQL Server Data-tier Application Project in Visual Studio Ultimate/Premium and then open it in the "lite" version, it still sees all the extra folders and will script them correctly.
So this is what I have found after playing around for a while and just about pulling my hair out when I couldn't figure out why one worked differently than the other. I believe most of these differences come from how the SQL Server Data-tier Application Projects (DAC) were set up, and those differences in that project type are carried over when it is converted to a SQL Server Database Project. I still find it very confusing that creating the project one way and converting to the "new" way causes so many differences. Microsoft could have addressed this by making sure that the new SQL Server Database Projects for SQL Server 2012 have the exact same structure as the previous version, but it seems they see each of these "new" project types as a chance to start over and do things differently.
Labels:
SQL 2012
Tuesday, July 24, 2012
SQL Saturday #169 - One Week Left!
Only one week left to submit your sessions for SQL Saturday #169 in Denver on September 22, 2012! If you have a topic to share and you are going to be in the Denver area on September 22nd, please submit your session for this great event by 7/31/2012!
We are still looking for sponsors as well, so please take a look at the sponsorship options that we have available and register here.
If you are going to attend the event, we ask that you register so that we know how many to expect.
Looking forward to putting on a great event in a couple of months!
Labels:
SQL Saturday
Monday, July 2, 2012
SQL Saturday
Wanted to put out a quick post that I have not had the time to write until now. If you are going to be in the Denver, CO area around September 22, 2012, I would encourage you to register for a day of SQL Server technical training given by many local and national speakers. Yes, that's right: we are doing SQL Saturday #169 at the same location as in 2010, Cherry Creek Presbyterian Church (10150 E Belleview Ave., Greenwood Village, CO 80111).
We are asking that anyone who registers pay a $10 fee for lunch; this will help us provide all attendees with a lunch that is not pizza, along with plenty of beverages throughout the event. Please register to attend, and we look forward to seeing you there!
If you are a speaker, please submit your session abstracts on our Call for Speakers page. And if you are a sponsor, please go to our Sponsors page to help us out!
I would also like to mention that Brian Knight, Founder of Pragmatic Works, will be presenting a 2-day SSAS workshop on September 20-21 at the Denver Microsoft office (7595 Technology Way, Suite 400, Denver, CO 80237). If you are interested in attending that event, you can go to the Pragmatic Works registration page linked here. You will need to bring your own computer to that workshop with the SQL bits and AdventureWorks already installed; more details can be found at the link above.
Also, in other SQL Saturday news I will be presenting at SQL Saturday #159 in Kansas City, MO on August 4, 2012. Looking forward to getting some great BBQ in Kansas City and presenting my "SSIS 2012: More Than Just a Pretty UI" session, so come see me if you are going to be at SQL Saturday #159!
I will post more on my presentation at SQL Saturday #159 and our own SQL Saturday #169 over the next few months!
Labels:
Pragmatic Works,
SQL Saturday,
training
Monday, June 4, 2012
Wormhole Switch
No, this is not another space post, even though the title might make you think so. This is actually more of a productivity post.
My computer setup in my home office is probably very similar to others, where I have multiple computers and multiple screens for each. But, my office setup really only allows for one full size keyboard and mouse that can easily fit on my desk for both computers.
In the past the easy solution to this was to get a KVM (Keyboard Video Mouse) switch. KVMs can be very flexible and work in just about all situations. I was able to use one with my wireless Logitech USB keyboard and mouse, but the issue is that I had to manually click a button to switch the keyboard and mouse between the 2 computers (I didn't need the video switching in this case since the monitors were hardwired to the individual computers). Not only do you have to manually switch between the computers, but the cables are usually pretty bulky and you have to run them to both computers. Some of the newer KVMs do have hotkeys you can set up on the keyboard to switch, but for me this was still not good enough.
Recently there have been many software-based solutions that use your existing Wi-Fi or any network connection to do the same thing, as long as both computers are on the same network. I tried several of these, like Synergy and Microsoft Garage's Mouse Without Borders, and none of them worked for me because the computer I'm using daily is always connected via VPN to the company I'm working for. Once the VPN connection is started, all of these software solutions stop working because the VPN software "locks down" the network connections and prevents that computer from being visible to the other one. I have read that there are ways to work around this, but it requires getting your networking people to open up parts of the VPN, which in my case was not going to happen.
So, based on this I knew that I probably needed a physical hardwired solution similar to the KVM. I'm not sure where I saw this online, but there was an article or mention of a new USB solution from J5 Create called the Wormhole Switch JUC400.
I'm not sure how this cable works technically, but the great thing is there is no disc or download required. As soon as you plug it into the first computer, it auto-detects and installs the required drivers/software (I've only tested this on Windows 7 computers so far) and waits for the other computer to be connected. Plug the other end into the 2nd computer and the same will happen there, and then both computers should show messages that they are communicating with each other.
At that point you just move the mouse across the screens, and the mouse/keyboard follows to whichever computer the mouse is currently on. You might need to configure which side each computer is on, but that is very easy by right-clicking the Wormhole Server icon in the Windows system tray. Since this is a hardwired connection, it works even when I have a VPN connection started on either or both computers. It's also nice that it is a single regular USB cable. The version that I purchased (JUC400) works on Windows, Macs, and even the iPad with the USB connection kit (only the keyboard is usable on the iPad). There are 2 other models: one that works with Windows/Android devices (JUC200) and a basic Windows-only version (JUC100). All of the versions also offer clipboard/file sharing between the connected computers, which is a really useful feature.
The first warning I would give is that this is only a keyboard and mouse sharing solution; you cannot switch monitors with this cable. The other warning is that if one of the computers goes to sleep or is locked, it might be difficult if not impossible to get the keyboard/mouse to switch over to it. In my setup I always use the laptop screen on one computer, so its built-in keyboard/touchpad is accessible at all times; I also moved the USB receiver for my keyboard/mouse over to the other computer (which I consider my main computer), so I can just use the laptop keyboard/touchpad when that does happen. Not a major issue for me, but it might cause others some challenges depending on your setup.
For my setup this was the perfect solution, so I wanted to share it with others in case it might help you out as well. The model that I purchased (JUC400) is only $39.99 from Amazon and the other 2 models are even cheaper at $23.35 (JUC200) and $24.99 (JUC100).
Labels:
KVM,
productivity,
USB
Thursday, May 31, 2012
Dragon
I wanted to take a break from SQL BI today to post about the success of SpaceX's Falcon 9 rocket with the Dragon spacecraft. For those that may not be up-to-date on the space program here in the United States, the Space Shuttle fleet was retired last year after more than 30 years of service. This left a huge void in the US capabilities to get supplies and astronauts up to and off of the International Space Station (ISS). NASA has been working with private industry in the US to provide these services so that NASA could focus on longer duration space missions. Many companies in the US have been competing for these contracts and SpaceX was one of the first. SpaceX was co-founded by Elon Musk, who also co-founded PayPal and Tesla Motors.
Fast forward to the last couple of years, where SpaceX has had their ups and downs getting their own rockets to launch. But in late 2010 they successfully launched their Falcon 9 rocket (the 9 signifies the number of rocket engines) with the Dragon capsule. That launch made them the first private company to launch a spacecraft of their own into orbit and return a capsule back to Earth. The next step was going to be a similar flight with a close approach to the ISS, but without docking. After the capabilities proven in that first launch, NASA and SpaceX decided to combine the 2 flights and allow a docking attempt with the ISS on the next launch.
On May 22, 2012 SpaceX launched another Falcon 9 rocket with another Dragon capsule on board. This time the Dragon capsule was loaded with over 1,000 pounds of cargo for the ISS. The Dragon capsule was able to prove that it could perform the required tasks for docking, and it was successfully captured by the ISS on May 25, 2012. The crew of the ISS was able to open the capsule and swap out the cargo with items that could be returned to Earth. This is a very important difference from any of the other vehicles that can currently dock with the ISS, as this is the only one currently proven to return cargo back to Earth. The Dragon capsule lands in the ocean the same way the Mercury, Gemini, and Apollo capsules used to. And today the Dragon capsule was released from the ISS and safely landed in the Pacific Ocean.
This is a huge step in US space capabilities and I'm sure that it will lead to many other successes in the future for SpaceX and the other private companies that will be providing this capability. I'm sure there will be more setbacks, but I hope that these companies can continue to provide these capabilities and others. Hopefully SpaceX will also be allowed to use these Dragon capsules to take human crews up to the ISS and other space habitats built in the future and maybe even go on from there to the Moon or further.
Congratulations SpaceX and NASA!
Monday, May 21, 2012
SSIS 2012 Microsoft Connector for Oracle Upgrade Issues
For anyone that uses SSIS to get data from Oracle data sources, you have probably felt the pain of working with the components provided by Microsoft in the default installation of SSIS. You may have even downloaded the free Oracle tools for Windows that provide some more options for connecting to Oracle. Both of these choices are OK and may work fine in your environments, but for the best solution Microsoft worked with Attunity to make their connectors available for everyone to use without any additional licensing cost (FREE)! These connectors have been considered the best Oracle connectors you can use with SSIS for quite a while now. When SQL Server 2012 was released in April, v2.0 of these connectors was also released, along with an updated v1.2 for older versions of SSIS.
Now for the twist in the story: what is the upgrade path if you are currently using Microsoft Connectors v1.1 for Oracle by Attunity (wow, that is a mouthful) and you want to upgrade your SSIS packages to SSIS 2012? Logic would tell you that you should be able to install SSIS 2012 and the Microsoft Connectors v2.0 for Oracle by Attunity and then just run the upgrade wizard in SSDT (SQL Server Data Tools) to get your packages up and running in SSIS 2012 and the latest Oracle connector, right? Not so fast. Unfortunately, if you try this your package will be upgraded to SSIS 2012, but the Oracle components that you created using the older versions of the Attunity connectors will no longer work, and you may not be able to edit or even delete them from your package! Below is a screenshot that shows what your component will look like after the upgrade:
You will also see errors, similar to this:
Error loading xxx.dtsx: The component is missing, not registered, not upgradeable, or missing required interfaces. The contact information for this component is "Oracle Source;Microsoft Connector for Oracle by Attunity; Attunity Ltd.; All Rights Reserved; http://www.attunity.com;2".
Error loading xxx.dtsx: The component metadata for "Oracle Source" could not be upgraded to the newer version of the component. The PerformUpgrade method failed.
You will not be able to edit the component to try to fix it, and depending on how many of these components you are using in each package, you may even see this error message when you try to delete the component from your package:
So, if you are unable to edit or even delete these invalid components from your package, what do you do? The only option available to you at this point is to open the .dtsx file in a text editor so that you can see the XML, but if you go that route and you are not very familiar with XML structure or how the XML for SSIS is laid out, you might end up making a bigger mess.
Thankfully, after a week or so of fighting this issue for a client I'm currently working with, we were able to get a solution from Microsoft Support that is very quick and will prevent you from losing any of your query, mapping, or metadata! The fix is to use a text editor (I prefer Notepad++) and do a search and replace for the following GUID in your packages:
{4CAC6073-BCA7-430E-BD29-68A9F0012C6D}
and replace it with:
{CB67CD40-126C-4280-912D-2A625DFAFB66}
The first GUID should be unique to v1.1 of the connector, so if you have upgraded to v1.2 that GUID may be different.
Once I completed this on all 120+ packages that used the Oracle connector and reloaded the updated versions in SSDT, all of the error messages went away and the data flows appeared correctly as shown below:
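Doing that search and replace by hand across 120+ packages is tedious, so a small script can apply the fix in bulk. This is just a sketch: it hard-codes the two GUIDs shown above (which may differ for your connector versions), and you should have your packages backed up or in source control before running anything like it:

```python
# Sketch: apply the connector GUID fix across every .dtsx package under a
# folder, instead of editing each file by hand. The GUIDs below are the
# v1.1 -> v2.0 pair from this post; verify them against your own packages.
from pathlib import Path

OLD_GUID = "{4CAC6073-BCA7-430E-BD29-68A9F0012C6D}"  # old v1.1 connector
NEW_GUID = "{CB67CD40-126C-4280-912D-2A625DFAFB66}"  # new v2.0 connector

def fix_packages(folder: str) -> int:
    """Replace the old connector GUID in each .dtsx file; return how many files changed."""
    changed = 0
    for dtsx in Path(folder).rglob("*.dtsx"):
        text = dtsx.read_text(encoding="utf-8")
        if OLD_GUID in text:
            dtsx.write_text(text.replace(OLD_GUID, NEW_GUID), encoding="utf-8")
            changed += 1
    return changed
```

After running it, reopen the packages in SSDT to confirm the components load before committing the changes.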
I am not sure if this will be a continuing issue with these connectors as new versions are released, or if it is because of the major changes required going from previous versions of SSIS to SSIS 2012. Either way, I wanted to make sure this information was out there to help those who are upgrading avoid some of the problems we had.
Friday, May 11, 2012
SSIS 2012 Presentations
I'm going to be getting back into presenting again! I used to present on a variety of topics at local User Groups, but took an extended break when work and life got really busy. Things are finally settling down again, and now I'm really looking forward to getting back into presenting on a more regular basis.
My first presentation will be a 30-minute session at this Thursday's Denver SQL Server User Group meeting. This will be a preview of a presentation on the new features in SSIS 2012 that I hope to give in many different venues. I came up with what I think is a very clever title: "SSIS 2012: More Than Just a Pretty UI". I was looking for a title that was catchy and also captured what a lot of people have already heard about SSIS 2012. This will be a demo-only presentation showing as many features as I can; there are probably enough new features in SSIS alone that I could go for hours. I'll probably change what I demo each time I present as I find out which items demo better than others. For those that have not presented, this is probably the most dangerous type of presentation to do, where you are doing as many live demos as you can.
I will also be doing the first full-length version of this presentation on Saturday 5/19/2012 at the Rocky Mountain Tech Trifecta! I'm really happy to be a speaker at this event; I have attended it and even volunteered to help set it up in the past. This will be my first time presenting there, so I'm really looking forward to seeing everyone at this annual free training event that covers Development, SharePoint, Office, and SQL Server, among many other topics. Make sure you register at http://www.rmtechtrifecta.com/#register and come to my presentation!
Keep your eyes open and I may be presenting in an area near you!
Labels:
Microsoft BI,
SQL 2012,
SSIS
Tuesday, April 17, 2012
Virtualization on Laptops and Windows 8
I wanted to cover something that is not Microsoft BI specific today, but still related.
I love to have the newest OS and software available to try things out and for presentations/demos, and like anyone else who does this, I prefer not to carry around multiple laptops or connect to servers over sometimes unreliable Internet connections. Over the years I have tried many different virtualization technologies on both Microsoft and Apple platforms. I have used paid products like VMware and Parallels along with freely available ones like Virtual PC and VirtualBox. While all of these have their pluses and minuses, lately I have been using Microsoft's Hyper-V technologies. Currently the biggest downside to using Hyper-V is that it is only available in its most complete form in Windows Server 2008 R2. This is a bit inconvenient for use on a laptop, but I currently have my MacBook Pro set up with Boot Camp (Apple's dual-boot technology) to run both OS X Lion and Windows Server 2008 R2 Enterprise without any virtualization. Then I can use Hyper-V in the Windows partition without any issues.
To get my Windows partition setup on the Mac so that it works more like Windows 7 then Windows Server I used the resources on win2008r2workstation.com and Mathieu Chateau's blog to get everything up and running as good as possible (I still have some strange driver issue, but it doesn't prevent any of the regular functions from working correctly). With all of these changes I'm still able to do "normal" things in the Windows partition like run Office and play Star Wars: The Old Republic. I have even been able to take advantage of the Thunderbolt port on my MacBook Pro in Windows to drive the 27" Apple Thunderbolt display and a 1TB LaCie External Thunderbolt harddrive. With the 16GB RAM and the 512GB SSD in my MacBook Pro this setup as worked very well for over the last 18 months.
Now with the new Lenovo laptop that I got from Pragmatic Works, I have been trying to figure out a better solution than using VirtualBox. Don't get me wrong, VirtualBox is a great tool and is the only free solution that allows you to run both 32-bit and 64-bit guest operating systems. But now that I'm used to Hyper-V, I'm really looking for a way to get that working on this laptop without requiring Windows Server 2008 R2.
Thankfully Microsoft has the answer coming soon with Windows 8! For those that didn't hear the news yesterday, Microsoft has finally announced that they won't be going version crazy with Windows 8. For traditional Windows computers there will really only be two editions, basic and Pro (there will probably also be an Enterprise edition). And per that announcement, the Hyper-V functionality will be moved down to the Pro edition of Windows 8 (now being referred to as Client Hyper-V)! This should fix many of the issues that I have had with laptop setups over the years. On some of the Hyper-V blogs it does appear that the issues with using wireless networks in Hyper-V have been resolved, which is great news. They have also been able to remove the restriction on the different power states when running Hyper-V (for those that have not installed Windows Server with the Hyper-V role on a laptop, you are currently not allowed to use hibernate or sleep modes while the role is running).
This will greatly simplify my laptop setups, though I have not yet installed the Consumer Preview of Windows 8 to verify all of this. It does look like Windows 8 will be the way to go if you need to run virtual environments on a laptop.
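If the announcements hold, turning on Client Hyper-V in Windows 8 Pro should just be a matter of enabling an optional Windows feature from an elevated prompt. Below is a sketch based on how the Hyper-V role is enabled today on Windows Server; the exact feature name ("Microsoft-Hyper-V") is my assumption until I can verify it in the Consumer Preview:

```shell
# Enable Client Hyper-V on Windows 8 Pro (run from an elevated prompt).
# Assumption: the optional-feature name "Microsoft-Hyper-V" carries over
# from Windows Server -- verify before relying on it.
dism /online /enable-feature /featurename:Microsoft-Hyper-V /all

# PowerShell equivalent:
# Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Hyper-V -All
```

A reboot is required after enabling the feature, and note that Client Hyper-V is expected to require a 64-bit processor with SLAT support.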
Thursday, April 12, 2012
SQL Server 2012 Certifications
For the last 2 weeks I have been taking the new SQL Server 2012 beta exams to see how things are going to be in the new exams. Since I had to accept an NDA to take the exams I cannot reveal their details, but I can share my general thoughts. There are quite a few new "technologies" being used for some of the questions, and I think that people who have struggled answering certain types of questions will find the new setup much better. My focus was on all of the Business Intelligence exams. Below are the exams that I took (since starting this post all of the 71's have been changed to 70's, see further in this post):
The biggest takeaway I got from this process was to never schedule this many exams over only 7 weekdays. Also, make sure you pay attention to the amount of time they give you for each exam. I assumed the length of these exams would be similar to the previous SQL Server 2008 exams and found that there is a lot more reading involved in the new ones. This may change when they go out of beta.
---
Since I started writing this entry last week, Microsoft has released the official plan for the SQL Server certifications. No longer are there MCTS or MCITP certifications; the new titles are MCSA (Microsoft Certified Solutions Associate) and MCSE (Microsoft Certified Solutions Expert). MCM (Microsoft Certified Master) has also been renamed to MCSM (Microsoft Certified Solutions Master). MCSE is the specialized level of certification; everyone that wants to get a SQL Server MCSE will have to pass all of the SQL Server MCSA exams:
Once you have your SQL Server MCSA you can work towards the MCSE: Data Platform or MCSE: Business Intelligence. In the end an MCSE certification will require you to pass a total of 5 exams, and since both specialties share the same 3 MCSA exams, getting both SQL Server MCSE specialties only requires a total of 7 exams.
There is also going to be a set of upgrade exams offered for existing MCTS and MCITP SQL Server 2008 certifications; if you go to this page, Microsoft Training has done a good job outlining the upgrade path.
Overall this looks like a good change to me; it makes the specializations very clear and requires that everyone have the same basic set of skills. All of these new exams should be available in June, so start studying now! Good luck.
Labels:
Certification,
Microsoft BI,
SQL 2012
Monday, April 2, 2012
Getting Started with SQL Server 2012
Now that SQL Server 2012 is generally available more of you may be asking where you can get more information on all of the new features (and also all of the licensing changes).
The first stop is Microsoft’s main SQL page at http://www.microsoft.com/sql. The site has been updated with all of the resources on SQL Server 2012. A couple of other links that are also at the top of that page are worth highlighting as well.
Microsoft held a virtual launch event for SQL Server 2012 on 3/7/2012 and all of the resources from that event are available at: http://www.sqlserverlaunch.com/. There are over 30 pre-recorded sessions covering all of the new features in SQL Server 2012 along with videos of the keynotes presented by Ted Kummert and Quentin Clark of Microsoft.
Microsoft has also made available a free ebook, Introducing Microsoft SQL Server 2012 by Ross Mistry and Stacia Misner. The ebook is available in PDF, EPUB, and MOBI formats. For more details on the chapters of the book you can go to this blog post from Microsoft Press. The book is also available in hardcopy from O’Reilly.
Now that you have read all about SQL Server 2012, download the free evaluation version or get it from MSDN and TechNet now!
Labels:
SQL 2012
Wednesday, March 28, 2012
Installing PowerPivot and Power View
For those that want to have all of the newest SQL BI toys installed in a VM or on your laptop, I have found a couple of really good resources that I use each time I need to install them.
Step by step guide on installing PowerPivot for SharePoint on a single machine
Checklist: Reporting Services, Power View, and PowerPivot for SharePoint
The first link is from PowerPivot-info.com, which is a great resource for PowerPivot. The second is an MSDN article that provides the Microsoft perspective on getting it setup. Both of these resources were posted before SQL 2012 was RTM and have been updated since RTM, so they both work very smoothly.
For those that are new to PowerPivot and Power View and wondering why these tools require detailed installation resources: it is because both of them require SharePoint Server 2010 to be installed. For those that have not installed SPS2010 yet, it can be challenging, especially making sure you get the PowerPivot Gallery installed correctly. Thankfully Power View, the newest SQL BI tool on the block, works very well with PowerPivot, so once you have PowerPivot set up, Power View is not too hard to add on.
Have fun and enjoy PowerPivot and Power View!
Labels:
Microsoft BI,
Power View,
PowerPivot,
SQL 2012
Tuesday, March 27, 2012
New Beginnings, Part 2
Has it really been since June of last year that I last posted to my blog? Time flies!
As I mentioned in my previous post, I started a new job after 14 years at the same company, which was a very big step for me. After 8 months at the new job I realized it just wasn't what I was promised or what I was looking for at this point in my career. On President's Day weekend I saw a posting on LinkedIn for a Microsoft BI Consultant with Pragmatic Works and jumped at the chance to work for a premier Microsoft Business Intelligence consulting firm. To be honest, at the time I didn't think I had a chance at getting the position; I'm not sure if it was low self-esteem or just a feeling that I lacked certain skills, but I decided to try anyway. I was contacted immediately on President's Day and went through both the initial and technical interviews that same day. After a couple of weeks I got an offer from Pragmatic Works!
For those that don't know, Pragmatic Works is a very well-known Microsoft BI/SQL Server consulting company based in Jacksonville, Florida. The company is built on a core value of each employee giving something back to the community, which is how I got to know the company many years before even considering applying to work there. All of the consultants are encouraged to answer questions in the MSDN and BIDN forums as well as blog and, of course, present at events throughout the world. This is one of the key reasons I wanted to work for Pragmatic Works, as I have started to contribute more of my time to the SQL Server community and wanted to make that as much a part of my career as everything else I do.
So, you may be asking, does that mean Steve is relocating to Jacksonville? Not necessarily! Pragmatic Works is growing so fast and has clients across the country that having me in the Denver area is actually a great asset for them. When I am not traveling to client sites, I will be able to work from home. This presented another new challenge for me since I did not have a home office at the house before now. Thankfully we still had a spare bedroom, which I have now converted into a great home office where I can look out and see the mountains.
I am very thankful for all of my past accomplishments; without them I would not have been prepared for the amazing opportunities that I have now, and I look forward to all of the new challenges that will be coming.
This should be the start of new entries in this blog on a very regular basis going forward, so stay tuned!
Labels:
Microsoft BI,
Pragmatic Works