gdata.io.handleScriptLoaded({"version":"1.0","encoding":"UTF-8","feed":{"xmlns":"http://www.w3.org/2005/Atom","xmlns$openSearch":"http://a9.com/-/spec/opensearchrss/1.0/","xmlns$gd":"http://schemas.google.com/g/2005","xmlns$georss":"http://www.georss.org/georss","xmlns$thr":"http://purl.org/syndication/thread/1.0","xmlns$blogger":"http://schemas.google.com/blogger/2008","id":{"$t":"tag:blogger.com,1999:blog-3846425018755283358"},"updated":{"$t":"2023-11-24T02:35:19.340-06:00"},"category":[{"term":"Licensing"},{"term":"Hardware"},{"term":"SSMS"},{"term":"Configuration"},{"term":"Security"},{"term":"Replication"},{"term":"Linux"},{"term":"Backup/Recovery"},{"term":"Azure"},{"term":"SSIS"},{"term":"Github"},{"term":"sqlcmd"},{"term":"Maintenance"},{"term":"SQL Agent"},{"term":"Monitoring"},{"term":"Visual Studio"},{"term":"Self Learning"},{"term":"Troubleshooting"},{"term":"Performance Tuning"},{"term":"SSRS"},{"term":"PowerShell"},{"term":"Script"}],"title":{"type":"text","$t":"Travis Gan"},"subtitle":{"type":"html","$t":"Technical blog on SQL Server, BI Stack, Azure and other 
technologies"},"link":[{"rel":"http://schemas.google.com/g/2005#feed","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/posts/default?alt\u003djson-in-script\u0026orderby\u003dpublished"},{"rel":"self","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/posts/default?alt\u003djson-in-script\u0026orderby\u003dpublished"},{"rel":"alternate","type":"text/html","href":"http://www.travisgan.com/"},{"rel":"hub","href":"http://pubsubhubbub.appspot.com/"},{"rel":"next","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/posts/default?alt\u003djson-in-script\u0026start-index\u003d26\u0026max-results\u003d25\u0026orderby\u003dpublished"}],"author":[{"name":{"$t":"Travis"},"email":{"$t":"noreply@blogger.com"},"gd$image":{"rel":"http://schemas.google.com/g/2005#thumbnail","width":"35","height":"35","src":"//www.blogger.com/img/blogger_logo_round_35.png"}}],"generator":{"version":"7.00","uri":"https://draft.blogger.com","$t":"Blogger"},"openSearch$totalResults":{"$t":"87"},"openSearch$startIndex":{"$t":"1"},"openSearch$itemsPerPage":{"$t":"25"},"entry":[{"id":{"$t":"tag:blogger.com,1999:blog-3846425018755283358.post-1852317416927573807"},"published":{"$t":"2022-11-23T23:40:00.041-06:00"},"updated":{"$t":"2023-01-10T23:59:02.613-06:00"},"category":[{"scheme":"http://www.blogger.com/atom/ns#","term":"Self Learning"}],"title":{"type":"text","$t":"Microsoft Certified"},"content":{"type":"html","$t":"\u003cdiv class\u003d\"separator\" style\u003d\"clear: both;\"\u003e\u003ca href\u003d\"https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg_wJD1CstaQ_9F70LSCAeOdKpodgF_L7dmRwzBbVLpQDgjwrTChKUta9FhVKIv2B-NPd5mjhBH3uLx7LJKUNpQ6ly-mz5QbNHGwR4VL_d1NiwZBcwonUc2Qfoypg0bs0KpN6jqq_RpCZL1Uq9AXgc89d8TWiL2Di1mxSm1VVVjn4voxIetWX2cDSgb1w/s367/mscer.png\" style\u003d\"clear: left; display: block; float: left; padding: 1em 0px; text-align: center;\"\u003e\u003cimg alt\u003d\"\" border\u003d\"0\" data-original-height\u003d\"137\" 
data-original-width\u003d\"367\" src\u003d\"https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg_wJD1CstaQ_9F70LSCAeOdKpodgF_L7dmRwzBbVLpQDgjwrTChKUta9FhVKIv2B-NPd5mjhBH3uLx7LJKUNpQ6ly-mz5QbNHGwR4VL_d1NiwZBcwonUc2Qfoypg0bs0KpN6jqq_RpCZL1Uq9AXgc89d8TWiL2Di1mxSm1VVVjn4voxIetWX2cDSgb1w/s320/mscer.png\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\nIt has been a while since I last pursued a \u003ca href\u003d\"https://www.travisgan.com/2015/12/microsoft-recertification.html\"\u003eMicrosoft certification\u003c/a\u003e. \n\u003cbr /\u003e\u003cbr /\u003e\nAfter a few years working as a developer, I focused primarily on the SQL Server data platform and business intelligence. I learned from work experience, Microsoft articles, and various blogs from the awesome SQL Server community. To test my level of knowledge and exposure in this field, I also took on the challenge of certification as a Microsoft Certified Solutions Expert for Data Platform and Business Intelligence.\n\u003cbr /\u003e\u003cbr /\u003e\nAs my role at work continues to evolve, I have been wearing multiple hats, and at times switching hats. I found myself heavily involved in Azure AD, Okta, Azure cloud resources, Azure DevOps, GitHub, App / API Gateway, etc., in addition to SQL Server administration. \n\u003cbr /\u003e\u003cbr /\u003e\n\u003ca name\u003d'more'\u003e\u003c/a\u003eSo again, I thought it might be a good idea to take on the certification challenge in these areas, both to gauge my knowledge and to learn something new that I might not have been exposed to previously. At the same time, I reap the benefit of increased credibility with existing or new employers. Win-win.\n\u003cbr /\u003e\u003cbr /\u003e\nFor those who have not taken a Microsoft certification for a while, Microsoft has made some changes to its websites and certification paths. Microsoft now also provides free online study materials in addition to instructor-led classes. 
You can find the newest information at this \u003ca href\u003d\"https://learn.microsoft.com/en-us/certifications/\" target\u003d\"_blank\"\u003eMicrosoft Learn certification site\u003c/a\u003e. In addition to this, at the moment, Microsoft also offers the \u003ca href\u003d\"https://esi.microsoft.com/\" target\u003d\"_blank\"\u003eEnterprise Skills Initiative\u003c/a\u003e\u0026nbsp;to organizations for their employees to learn and take the certification examinations for free.\n\u003cbr /\u003e\u003cbr /\u003e\nI reviewed the different Microsoft certifications, and here are three that I felt I have experience in and/or interest in.\n\u003cbr /\u003e\u003cbr /\u003e\n\u003cb\u003eMicrosoft Azure Administrator\u003c/b\u003e (\u003ca href\u003d\"https://learn.microsoft.com/en-us/certifications/azure-administrator/\"\u003emore details\u003c/a\u003e) \u003cbr /\u003e \nManage Azure identities and governance\u003cbr /\u003e\nImplement and manage storage\u003cbr /\u003e\nDeploy and manage Azure compute resources\u003cbr /\u003e\nConfigure and manage virtual networking\u003cbr /\u003e\nMonitor and maintain Azure resources\n\u003cbr /\u003e\u003cbr /\u003e\n\n\u003cb\u003eAzure Solutions Architect Expert\u003c/b\u003e (\u003ca href\u003d\"https://learn.microsoft.com/en-us/certifications/azure-solutions-architect/\"\u003emore details\u003c/a\u003e) \u003cbr /\u003e\nDesign identity, governance, and monitoring solutions\u003cbr /\u003e\nDesign data storage solutions\u003cbr /\u003e\nDesign business continuity solutions\u003cbr /\u003e\nDesign infrastructure solutions\n\u003cbr /\u003e\u003cbr /\u003e\n\n\u003cb\u003eDevOps Engineer Expert\u003c/b\u003e (\u003ca href\u003d\"https://learn.microsoft.com/en-us/certifications/devops-engineer/\"\u003emore details\u003c/a\u003e) \u003cbr /\u003e \nDevelop an instrumentation strategy\u003cbr /\u003e\nDevelop a Site Reliability Engineering (SRE) strategy\u003cbr /\u003e\nDevelop a security and compliance plan\u003cbr /\u003e\nManage source 
control\u003cbr /\u003e\nFacilitate communication and collaboration\u003cbr /\u003e\nDefine and implement continuous integration\u003cbr /\u003e\nDefine and implement a continuous delivery and release management strategy\n\u003cbr /\u003e\u003cbr /\u003e\n\nAfter a few months of study and exploration, I am glad that in addition to the MCSE for Data Platform and Business Intelligence, I am now certified as an Azure Administrator Associate, Azure Solutions Architect Expert, and DevOps Engineer Expert.\n\u003cbr /\u003e\u003cbr /\u003e\nHappy holidays!"},"link":[{"rel":"replies","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/1852317416927573807/comments/default","title":"Post Comments"},{"rel":"replies","type":"text/html","href":"https://www.travisgan.com/2022/11/microsoft-certified.html#comment-form","title":"0 Comments"},{"rel":"edit","type":"application/atom+xml","href":"https://draft.blogger.com/feeds/3846425018755283358/posts/default/1852317416927573807"},{"rel":"self","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/posts/default/1852317416927573807"},{"rel":"alternate","type":"text/html","href":"https://www.travisgan.com/2022/11/microsoft-certified.html","title":"Microsoft 
Certified"}],"author":[{"name":{"$t":"Travis"},"email":{"$t":"noreply@blogger.com"},"gd$image":{"rel":"http://schemas.google.com/g/2005#thumbnail","width":"35","height":"35","src":"//www.blogger.com/img/blogger_logo_round_35.png"}}],"media$thumbnail":{"xmlns$media":"http://search.yahoo.com/mrss/","url":"https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg_wJD1CstaQ_9F70LSCAeOdKpodgF_L7dmRwzBbVLpQDgjwrTChKUta9FhVKIv2B-NPd5mjhBH3uLx7LJKUNpQ6ly-mz5QbNHGwR4VL_d1NiwZBcwonUc2Qfoypg0bs0KpN6jqq_RpCZL1Uq9AXgc89d8TWiL2Di1mxSm1VVVjn4voxIetWX2cDSgb1w/s72-c/mscer.png","height":"72","width":"72"},"thr$total":{"$t":"0"}},{"id":{"$t":"tag:blogger.com,1999:blog-3846425018755283358.post-5778135876911825064"},"published":{"$t":"2022-11-02T07:30:00.020-05:00"},"updated":{"$t":"2022-11-02T23:40:08.562-05:00"},"category":[{"scheme":"http://www.blogger.com/atom/ns#","term":"PowerShell"},{"scheme":"http://www.blogger.com/atom/ns#","term":"Azure"}],"title":{"type":"text","$t":"Access Office 365 Exchange Online Mailbox with Client Credential Flow"},"content":{"type":"html","$t":"\u003cdiv class\u003d\"separator\" style\u003d\"clear: both;\"\u003e\u003ca href\u003d\"https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgY4u6bDrWYRgvIDY0NHB1U7fOSbiFsxmU1DqWnBjHU2WT9vgHvQtHvpp6ZIM6Rj3Mq5TOwmsgs4TbY61znomd51bMn0-P3h_oGRCgd-MlNJJjyeuCZdaTlDEHeKe437LiQtDW0k_WUvlAr79E-z5cbAuiddxljc-7JDykJ1oq0h5XfXdcOtzFNrS9HsA/s290/mail-oauth0.jpg\" style\u003d\"clear: left; display: block; float: left; padding: 0em 15px; text-align: center;\"\u003e\u003cimg alt\u003d\"\" border\u003d\"0\" data-original-height\u003d\"174\" data-original-width\u003d\"290\" src\u003d\"https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgY4u6bDrWYRgvIDY0NHB1U7fOSbiFsxmU1DqWnBjHU2WT9vgHvQtHvpp6ZIM6Rj3Mq5TOwmsgs4TbY61znomd51bMn0-P3h_oGRCgd-MlNJJjyeuCZdaTlDEHeKe437LiQtDW0k_WUvlAr79E-z5cbAuiddxljc-7JDykJ1oq0h5XfXdcOtzFNrS9HsA/s320/mail-oauth0.jpg\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\nAs of this 
writing, Microsoft has disabled basic authentication for Exchange Online; the alternative, more secure method to authenticate and access Exchange Online is the \u003ca href\u003d\"https://learn.microsoft.com/en-us/azure/active-directory/develop/v2-oauth2-client-creds-grant-flow\" target\u003d\"_blank\"\u003eClient Credential flow\u003c/a\u003e using an OAuth token. The OAuth (OIDC) Client Credential flow is typically used for background processes (eg. Windows services or daemons) that run without user interaction.\n\u003cbr /\u003e\u003cbr /\u003e\nMicrosoft provides different APIs to access Exchange Online. To authenticate against these APIs, we need an OAuth access token with the appropriate permissions when invoking them. This type of access token is obtained from Azure AD. \n\u003cbr /\u003e\u003cbr /\u003e\n\u003cspan\u003e\u003ca name\u003d'more'\u003e\u003c/a\u003e\u003c/span\u003e\nA little background: in order to access a Microsoft resource (eg. Azure, Exchange Online, SharePoint) via the Microsoft APIs, the entry point is an identity (or service principal). In this case, it is created through an Azure AD app registration. \n\u003cbr /\u003e\u003cbr /\u003e\nIn Azure AD, an API permission is granted either as a delegated permission or an application permission. When using the Client Credential flow in Azure AD, the access token is acquired under the application identity with only the application permissions (delegated permissions are not used since there is no user). The scope used during the access token request is the ./default scope, for example https://graph.microsoft.com/.default or https://outlook.office365.com/.default, depending on which API you are using.\n\u003cbr /\u003e\u003cbr /\u003eTo configure the access restrictions of the application, we will need the Exchange Online PowerShell module (ExchangeOnlineManagement), the administrative interface to Microsoft Exchange Online. 
To install the PowerShell module,\n\u003cpre class\u003d\"brush:ps\"\u003eInstall-Module -Name ExchangeOnlineManagement -Scope CurrentUser\u003c/pre\u003e\nConnect to the organization's Microsoft Exchange Online and enter the credentials in the interactive prompt.\n\u003cpre class\u003d\"brush:ps\"\u003eConnect-ExchangeOnline\u003c/pre\u003e\nProvided your account has an appropriate role in the organization (eg. global admin, exchange admin, etc.), the connection appears to subsequently download a dynamic module to a user path like this, C:\Users\abc\AppData\Local\Temp\tmpEXO_xyzabc10.a12\, which seems to contain cmdlets similar to the ones in the PowerShell module. \n\u003cbr /\u003e\u003cbr /\u003e\nTo locate the dynamic module, you can look up one of its cmdlets, eg. \n\u003cpre class\u003d\"brush:ps\"\u003eGet-Command Get-ServicePrincipal\u003c/pre\u003e\nWith all the tools and some background covered, here are two examples of using OAuth to access an Office 365 Exchange Online mailbox: one using the legacy Office 365 Exchange Online API to read the mail of a mailbox via IMAP/POP, and another using the Graph API to send mail as a user. Please be aware these are not the only ways, but simply examples; Microsoft provides several other ways to read and send mail with OAuth.\n\u003cbr /\u003e\u003cbr /\u003e\nIn order to access the APIs, we need a valid access token with the appropriate permission (scope) from Azure AD. First we need to create an application service principal via app registration.\n\u003cbr /\u003e\u003cbr /\u003e\nLog in to the Azure portal and go to Azure Active Directory. 
Select app registration and select new registration.\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both;\"\u003e\u003ca href\u003d\"https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj9UKjFVDbppup0IERkjr2-yLiUyzg4bqMiJuw53PBYaX2ZOfhDj4qO54F3KjLKcAKh2avx-jbt2pkr7cvU58_KQkqNlm6B7Gi2ZzHccIcd3rwssJf_4ylbHVEbjkcXdd1nzvEe14zwegGyZ6xnR4rtmqbjLgOfyV6q1Ji0t25HmKC9G7d0A5OedZteKw/s1600/mail-oauth1.png\" style\u003d\"display: block; padding: 1em 0px; text-align: center;\"\u003e\u003cimg alt\u003d\"\" border\u003d\"0\" data-original-height\u003d\"293\" data-original-width\u003d\"919\" src\u003d\"https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj9UKjFVDbppup0IERkjr2-yLiUyzg4bqMiJuw53PBYaX2ZOfhDj4qO54F3KjLKcAKh2avx-jbt2pkr7cvU58_KQkqNlm6B7Gi2ZzHccIcd3rwssJf_4ylbHVEbjkcXdd1nzvEe14zwegGyZ6xnR4rtmqbjLgOfyV6q1Ji0t25HmKC9G7d0A5OedZteKw/s1600/mail-oauth1.png\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\nSince the application will be using client credential flow, the default option without any redirect URI is sufficient.\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both;\"\u003e\u003ca href\u003d\"https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj0mS1tPh45Gm4o6vzcJT32pu9A3cS_je2nm7vM7R9DCr23go-26XGplwEku2ffB4yaugATfGWj98BqSNdZO65EgFzNX7_nIBXinMan8Vt0dwVABY2Kp2NOMYcIUfoksga_w5j9YnbvH3D2qBkfOsf94Tr53-QnSt7As73Y-fTpLNxkmQ1NvUo0RdSXJw/s1600/mail-oauth2.png\" style\u003d\"display: block; padding: 1em 0px; text-align: center;\"\u003e\u003cimg alt\u003d\"\" border\u003d\"0\" data-original-height\u003d\"664\" data-original-width\u003d\"763\" src\u003d\"https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj0mS1tPh45Gm4o6vzcJT32pu9A3cS_je2nm7vM7R9DCr23go-26XGplwEku2ffB4yaugATfGWj98BqSNdZO65EgFzNX7_nIBXinMan8Vt0dwVABY2Kp2NOMYcIUfoksga_w5j9YnbvH3D2qBkfOsf94Tr53-QnSt7As73Y-fTpLNxkmQ1NvUo0RdSXJw/s1600/mail-oauth2.png\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\nNext we will need 
to create a Client Secret. Select certificates \u0026amp; secrets on the left menu and click on Client secrets and create a new client secret. Keep the Client Secret for later use.\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both;\"\u003e\u003ca href\u003d\"https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhlmfxTTvDrV7huKGl_GoyLF6o-GyLS2fgWNpJAuws14bhstWb-0MnxhETYUyIuo0M4fq_DIVGpeJvv6uFzJs5-n_TgMQB3mGCCvnePj4rfhydEs82noYNtaUMHX8P2A-Bf-21A0y0QxgU8wKbuhq4tuDRMr4uszO6TWAP9N9LIDA0ggObrltv1zpGh2A/s830/mail-oauth4.png\" style\u003d\"display: block; padding: 1em 0px; text-align: center;\"\u003e\u003cimg alt\u003d\"\" border\u003d\"0\" data-original-height\u003d\"642\" data-original-width\u003d\"830\" src\u003d\"https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhlmfxTTvDrV7huKGl_GoyLF6o-GyLS2fgWNpJAuws14bhstWb-0MnxhETYUyIuo0M4fq_DIVGpeJvv6uFzJs5-n_TgMQB3mGCCvnePj4rfhydEs82noYNtaUMHX8P2A-Bf-21A0y0QxgU8wKbuhq4tuDRMr4uszO6TWAP9N9LIDA0ggObrltv1zpGh2A/s320/mail-oauth4.png\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\nNext we need to grant API permission to the application. Select API Permissions on the left menu, and add a permission. 
\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both;\"\u003e\u003ca href\u003d\"https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEirMx2p1k3JppaL4Th2axKWYKNY0MRW_t_EA40rxHmjnHwG6094gIgKw6gg0TVtjWqHzAI5Us6_3XDQ3Gesp_Qrt_bt6cwc-MCm2J6-oqqOpg9gR8PmcL-sUvi3Ya1LUq1p5HtM3xspn5oPfusvyCHb4Jwgyn-NuiphHdiuljO1lZTd5V3LHYVIxOrR5Q/s861/mail-oauth5.png\" style\u003d\"display: block; padding: 1em 0px; text-align: center;\"\u003e\u003cimg alt\u003d\"\" border\u003d\"0\" data-original-height\u003d\"676\" data-original-width\u003d\"861\" src\u003d\"https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEirMx2p1k3JppaL4Th2axKWYKNY0MRW_t_EA40rxHmjnHwG6094gIgKw6gg0TVtjWqHzAI5Us6_3XDQ3Gesp_Qrt_bt6cwc-MCm2J6-oqqOpg9gR8PmcL-sUvi3Ya1LUq1p5HtM3xspn5oPfusvyCHb4Jwgyn-NuiphHdiuljO1lZTd5V3LHYVIxOrR5Q/s320/mail-oauth5.png\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\nFor Graph API, select Microsoft Graph, click on Application Permissions.\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both;\"\u003e\u003ca href\u003d\"https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh4P4rUfGTa7Mhj7QJiNqYX6hUNWIP-yb4eMLVQehSUPMfEkEeOLRGfW0d0w4uVqkkX7dG9nQ8ja3n2cCxIB9mZcj_LWcQmxy5_6YlWmEuyFORwtn2gghXDI9vcOB_hBHgo4Aui6FLUcBr9TQSSApvTW3X3YRjIqJKy5e3VX_GYweOMXVmtGOH7aKGHjw/s1248/mail-oauth6.png\" style\u003d\"display: block; padding: 1em 0px; text-align: center;\"\u003e\u003cimg alt\u003d\"\" border\u003d\"0\" data-original-height\u003d\"396\" data-original-width\u003d\"1248\" src\u003d\"https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh4P4rUfGTa7Mhj7QJiNqYX6hUNWIP-yb4eMLVQehSUPMfEkEeOLRGfW0d0w4uVqkkX7dG9nQ8ja3n2cCxIB9mZcj_LWcQmxy5_6YlWmEuyFORwtn2gghXDI9vcOB_hBHgo4Aui6FLUcBr9TQSSApvTW3X3YRjIqJKy5e3VX_GYweOMXVmtGOH7aKGHjw/s320/mail-oauth6.png\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\nSearch for mail, and check Mail.Send. Notice that the permission will allow the application to send mail as any user. 
We will restrict its access via an application access policy, described later.\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both;\"\u003e\u003ca href\u003d\"https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgBIvRfByCTh06muzceXokw46TH6Ft1HulGpMTwnMEjtG6hhpdY7Kq7z2rTF9R1_-5scRdA_IH8BE2cNOY00mz3zOCVyj6fo_52pWh1wzMaTHMj0y11bEttO7CCZs0SMum9QtOL4watNDQWLpfHC1BbLDz41apWa1d97fXwQan9G2_ZgdBHxiZLTMSDKA/s1138/mail-oauth7.png\" style\u003d\"display: block; padding: 1em 0px; text-align: center;\"\u003e\u003cimg alt\u003d\"\" border\u003d\"0\" data-original-height\u003d\"719\" data-original-width\u003d\"1138\" src\u003d\"https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgBIvRfByCTh06muzceXokw46TH6Ft1HulGpMTwnMEjtG6hhpdY7Kq7z2rTF9R1_-5scRdA_IH8BE2cNOY00mz3zOCVyj6fo_52pWh1wzMaTHMj0y11bEttO7CCZs0SMum9QtOL4watNDQWLpfHC1BbLDz41apWa1d97fXwQan9G2_ZgdBHxiZLTMSDKA/s320/mail-oauth7.png\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\nFor the Office 365 Exchange Online API, select the APIs my organization uses tab, search for office, and select Office 365 Exchange Online. Click on Application Permissions. 
\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both;\"\u003e\u003ca href\u003d\"https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhgm3N0Tw2vZRmwFZOKQiF5r9Q4C2dqNArXfbgjEOKdynAgYq3pDK-NN-48iqgIV-u3LCO-yQ20MbXLOcuJxzKBdZY09ZH3jxoYRjXr4Fn1ieh0DtynrmqWBS-8mlMw72jf5inlHSaceNBJGA7eVjtzp3V9_fD6QEEuLkISsaN6WgdwP6_wJTe28ZyE4g/s617/mail-oauth8.png\" style\u003d\"display: block; padding: 1em 0px; text-align: center;\"\u003e\u003cimg alt\u003d\"\" border\u003d\"0\" data-original-height\u003d\"513\" data-original-width\u003d\"617\" src\u003d\"https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhgm3N0Tw2vZRmwFZOKQiF5r9Q4C2dqNArXfbgjEOKdynAgYq3pDK-NN-48iqgIV-u3LCO-yQ20MbXLOcuJxzKBdZY09ZH3jxoYRjXr4Fn1ieh0DtynrmqWBS-8mlMw72jf5inlHSaceNBJGA7eVjtzp3V9_fD6QEEuLkISsaN6WgdwP6_wJTe28ZyE4g/s320/mail-oauth8.png\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\nSearch for and check IMAP.AccessAsApp (if you need to access via POP, check POP.AccessAsApp). \n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both;\"\u003e\u003ca href\u003d\"https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgQHrUHQHc1D6Rrtp5n7K4QZsof9GU4fy9uElknZ_2dr_Bi6kWS_nfl7IBxKaoGzSev8yPmWRcuvhAcFXO-_twVQFGsJxllK7sR_HOpp0GzCh9-QM3N_b44fCWrF82bfx4hLdX4-pFwxtXnpUdfSEZ8IpPaxlfUtBQU9YazewWd173bZY_biaOgl7RC8w/s1232/mail-oauth9.png\" style\u003d\"display: block; padding: 1em 0px; text-align: center;\"\u003e\u003cimg alt\u003d\"\" border\u003d\"0\" data-original-height\u003d\"735\" data-original-width\u003d\"1232\" src\u003d\"https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgQHrUHQHc1D6Rrtp5n7K4QZsof9GU4fy9uElknZ_2dr_Bi6kWS_nfl7IBxKaoGzSev8yPmWRcuvhAcFXO-_twVQFGsJxllK7sR_HOpp0GzCh9-QM3N_b44fCWrF82bfx4hLdX4-pFwxtXnpUdfSEZ8IpPaxlfUtBQU9YazewWd173bZY_biaOgl7RC8w/s320/mail-oauth9.png\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\nGrant admin consent to the permissions added.\n\u003cbr /\u003e\n\u003cdiv 
class\u003d\"separator\" style\u003d\"clear: both;\"\u003e\u003ca href\u003d\"https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgF44MZBMy0KNfK6Z7euJtrcxwZUZJlkFHydlhruqRtleCpS0NZ_U1rL9yWOmZA1WPCCB7sxrfpyrEiYaZwgU811Rouc-EkO1IOksNkrK8uqHAqOmTxUIJ59wjfx8QnkYONIZ5Lpi5Gy7E9XQAM5VaPEqVtUPq5LyKuzmNYaxjwPNTN4Oo1_T1BGG5TkA/s1851/mail-oauth10.png\" style\u003d\"display: block; padding: 1em 0px; text-align: center;\"\u003e\u003cimg alt\u003d\"\" border\u003d\"0\" data-original-height\u003d\"718\" data-original-width\u003d\"1851\" src\u003d\"https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgF44MZBMy0KNfK6Z7euJtrcxwZUZJlkFHydlhruqRtleCpS0NZ_U1rL9yWOmZA1WPCCB7sxrfpyrEiYaZwgU811Rouc-EkO1IOksNkrK8uqHAqOmTxUIJ59wjfx8QnkYONIZ5Lpi5Gy7E9XQAM5VaPEqVtUPq5LyKuzmNYaxjwPNTN4Oo1_T1BGG5TkA/s320/mail-oauth10.png\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\nWe need to get the ApplicationID (ClientID) and ObjectID of the application service principal for later use. Go to Enterprise Applications on the left menu\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both;\"\u003e\u003ca href\u003d\"https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgvxg5yYV6zH33YBYxibq8JbgkkoNSZrEuzuiRjgAzyMnY1q23rOu5qBipl0Mdspi3RF_DGEqu233wLAaCm8DoLUVsE8MrkYqdNzGctjJk7dhrmVr_GmVnoB85t6LIG12wIc8bG9inzIEVV_AvAnftkO_uUZwLb_FjyRvI4_hrD-b_WkTC8IFtH1LdRow/s436/mail-oauth11.png\" style\u003d\"display: block; padding: 1em 0px; text-align: center;\"\u003e\u003cimg alt\u003d\"\" border\u003d\"0\" data-original-height\u003d\"436\" data-original-width\u003d\"341\" height\u003d\"320\" src\u003d\"https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgvxg5yYV6zH33YBYxibq8JbgkkoNSZrEuzuiRjgAzyMnY1q23rOu5qBipl0Mdspi3RF_DGEqu233wLAaCm8DoLUVsE8MrkYqdNzGctjJk7dhrmVr_GmVnoB85t6LIG12wIc8bG9inzIEVV_AvAnftkO_uUZwLb_FjyRvI4_hrD-b_WkTC8IFtH1LdRow/s320/mail-oauth11.png\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\nSearch for the application and note down the ObjectID and 
ApplicationID.\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both;\"\u003e\u003ca href\u003d\"https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEipGYcdFx2i_qsVJS5lquV2ZFLRuf0B5W5wCGGzKF_BClHJZ76MQxymuO2ibCISzavllQCVd3cqO-lm71W07Ro7bO1P_fgB7thgvvzmb1O4e2QOZTfCRR2BEXM_gbsPLCDZyk4CilmLNVpY-1axeXO8woT2-7Mu22d8QYZUeXA5zNMHrCMTA4C2yG0oQw/s1425/mail-oauth12.png\" style\u003d\"display: block; padding: 1em 0px; text-align: center;\"\u003e\u003cimg alt\u003d\"\" border\u003d\"0\" data-original-height\u003d\"484\" data-original-width\u003d\"1425\" src\u003d\"https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEipGYcdFx2i_qsVJS5lquV2ZFLRuf0B5W5wCGGzKF_BClHJZ76MQxymuO2ibCISzavllQCVd3cqO-lm71W07Ro7bO1P_fgB7thgvvzmb1O4e2QOZTfCRR2BEXM_gbsPLCDZyk4CilmLNVpY-1axeXO8woT2-7Mu22d8QYZUeXA5zNMHrCMTA4C2yG0oQw/s320/mail-oauth12.png\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\nThe app registration is now completed. If needed, you can verify that an access token can be obtained. It is important to know ahead of time which API you are trying to invoke so you can obtain an access token with the correct audience (the aud claim in the JWT). For the Graph API, use https://graph.microsoft.com/.default as the scope, and for the Office 365 Exchange Online API, use https://outlook.office365.com/.default as the scope. 
Here is an example acquiring an access token for Graph API usage.\u003cbr /\u003e\n\u003cpre class\u003d\"brush:ps\"\u003e$tenantid \u003d 'your tenant id'\n$clientid \u003d 'your app client id'\n$clientsecret \u003d 'your app client secret'\n\n$token \u003d Invoke-RestMethod -Method Post -ContentType 'application/x-www-form-urlencoded' -Uri \"https://login.microsoftonline.com/$tenantid/oauth2/v2.0/token\" -Body @{\n    client_id \u003d $clientid\n    client_secret \u003d $clientsecret\n    scope \u003d 'https://graph.microsoft.com/.default'\n    grant_type \u003d 'client_credentials'\n}\n\n$token.access_token\n\u003c/pre\u003e\n\nYou can copy/paste to view the decoded access token (JWT) \u003ca href\u003d\"https://jwt.ms\" target\u003d\"_blank\"\u003ehere\u003c/a\u003e. See the below example of the aud claim and scope with Mail.Send role (permission) configured earlier.\n\u003cbr/\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both;\"\u003e\u003ca href\u003d\"https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhiVAQ5CK5O0ZwElmRAuwZjLOqFOK-NM_32cX0n57WtFJySlB4W8pSd2fLDcSbogiu-jpo60mfuBO_cVl6Gnf3VMBSE4ywKP0b1GkCTp0xITKQvsW04N9AEIXXoXKMLygtsqpcQWJzXgCvmHs6hS3_JY4qWa1h1-3CrVLBzk6s1r06EZ-0X8mL-57IGRA/s976/mail-oauth13.png\" style\u003d\"display: block; padding: 1em 0; text-align: center; \"\u003e\u003cimg alt\u003d\"\" border\u003d\"0\" width\u003d\"320\" data-original-height\u003d\"830\" data-original-width\u003d\"976\" src\u003d\"https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhiVAQ5CK5O0ZwElmRAuwZjLOqFOK-NM_32cX0n57WtFJySlB4W8pSd2fLDcSbogiu-jpo60mfuBO_cVl6Gnf3VMBSE4ywKP0b1GkCTp0xITKQvsW04N9AEIXXoXKMLygtsqpcQWJzXgCvmHs6hS3_JY4qWa1h1-3CrVLBzk6s1r06EZ-0X8mL-57IGRA/s320/mail-oauth13.png\"/\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cb\u003eLegacy Office 365 Exchange Online APIs\u003c/b\u003e\u003cbr /\u003e\nIn the API permission, we granted IMAP.AccessAsApp or POP.AccessAsApp access to the application service principal. 
This allows the application to access IMAP or POP. However, the application can't access any mailbox yet. To allow it to access a specific mailbox, we first need to add the service principal in Exchange Online and then grant that service principal permission (delegation) to the mailbox.\n\u003cbr /\u003e\u003cbr /\u003e\nCreate an Exchange service principal. Use the AppId and ObjectId from the Azure AD application registration.\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:ps\"\u003eNew-ServicePrincipal -AppId c33aa7**-****-****-****-******ffca0d -ServiceId c21b49**-****-****-****-******f33f47 -Organization 'tenant id' -DisplayName mailtest\n\u003c/pre\u003e\nTo grant the service principal permission to a mailbox (add delegation),\n\u003cpre class\u003d\"brush:ps\"\u003eAdd-MailboxPermission -Identity \"usera@emaildomain.com\" -User c21b49**-****-****-****-******f33f47 -AccessRights FullAccess\n\u003c/pre\u003e\nTo check the existing permissions (delegations) on a mailbox,\n\u003cpre class\u003d\"brush:ps\"\u003eGet-MailboxPermission -Identity \"usera@emaildomain.com\"\n\u003c/pre\u003e\nMore information on configuring IMAP/POP can be found \u003ca href\u003d\"https://learn.microsoft.com/en-us/exchange/client-developer/legacy-protocols/how-to-authenticate-an-imap-pop-smtp-application-by-using-oauth#register-service-principals-in-exchange\" target\u003d\"_blank\"\u003ehere\u003c/a\u003e.\n\u003cbr /\u003e\u003cbr /\u003e\nTo verify IMAP access with the application, you can use this \u003ca href\u003d\"https://github.com/DanijelkMSFT/ThisandThat/blob/main/Get-IMAPAccessToken.ps1\" target\u003d\"_blank\"\u003ePowerShell script\u003c/a\u003e written by a Microsoft employee. 
\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:ps\"\u003e\n$tenantid \u003d 'tenant id'\n$clientid \u003d 'c21b49**-****-****-****-******f33f47'\n$clientsecret \u003d '***************************'\n$mailbox \u003d 'usera@emaildomain.com'\n\n.\\Get-IMAPAccessToken.ps1 -tenantID $tenantid -clientId $clientid -clientsecret $clientsecret -targetMailbox $mailbox\n\u003c/pre\u003e\n\u003cbr/\u003e\n\u003cb\u003eGraph API\u003c/b\u003e\u003cbr /\u003e\nFor the Graph API, understand that granting an API application permission usually allows access across the whole tenant. For example, the Mail.Send permission allows the application to send mail as any user; the Mail.Read permission allows it to read mail in all mailboxes. To restrict the application to a specific mailbox / security group, create an Application Access Policy.\n\u003cbr /\u003e\u003cbr /\u003e\nNote that, as of this writing, if no application access policy exists, the get application access policy cmdlet returns an error.\n\u003cpre class\u003d\"brush:ps\"\u003eGet-ApplicationAccessPolicy\u003c/pre\u003e\n\u003cspan style\u003d\"color: red;\"\u003eWrite-ErrorMessage : Ex6F9304|Microsoft.Exchange.Configuration.Tasks.ManagementObjectNotFoundException|The operation\ncouldn't be performed because object 'OU\u003dabcefg.onmicrosoft.com,OU\u003dMicrosoft Exchange Hosted\nOrganizations,DC\u003dABCDE123456,DC\u003dPROD,DC\u003dOUTLOOK,DC\u003dCOM\\*' couldn't be found on\n'AABBCC112233000.ABCDE123456.PROD.OUTLOOK.COM'.\nAt C:\\Users\\abc\\AppData\\Local\\Temp\\tmpEXO_xyzabc10.a12\\tmpEXO_xyzabc10.a12.psm1:1113 char:13\n+             Write-ErrorMessage $ErrorObject $IsFromBatchingRequest\n+             ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\n    + CategoryInfo          : NotSpecified: (:) [Get-ApplicationAccessPolicy], ManagementObjectNotFoundException\n    + FullyQualifiedErrorId : [Server\u003dABCDEFGHIJ123,RequestId\u003dabcd-efgh-3e96-e79a-123456789012,TimeStamp\u003dSat, 31 S\n   ep 2022 25:02:47 
GMT],Write-ErrorMessage\u003c/span\u003e  \n\u003cbr /\u003e\n\u003cbr/\u003e\nTo add an application access policy that restricts the application's access to a certain mailbox or security group, use the AppId from the Azure AD application registration. The PolicyScopeGroupId is the mailbox or security group that the app is allowed to access.\n\u003cpre class\u003d\"brush:ps\"\u003eNew-ApplicationAccessPolicy -AccessRight RestrictAccess -AppId c33aa7**-****-****-****-******ffca0d -PolicyScopeGroupId usera@emaildomain.com -Description \"restrict mailtest access\"\u003c/pre\u003e\n\nTest the policy against the restricted account and/or other accounts; the AccessCheckResult indicates whether the effective access is granted or denied.\n\u003cpre class\u003d\"brush:ps\"\u003eTest-ApplicationAccessPolicy -AppId c33aa7**-****-****-****-******ffca0d -Identity usera@emaildomain.com\n\u003c/pre\u003e\nNote that even though the test command reports the policy result immediately, from personal experience the policy can take about an hour to become effective in practice.\n\u003cbr /\u003e\u003cbr /\u003e\nTo list all existing application access policies,\n\u003cpre class\u003d\"brush:ps\"\u003eGet-ApplicationAccessPolicy\n\u003c/pre\u003e\nPlease note that the Application Access Policy is only applicable to certain Microsoft Graph application permissions. 
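As a side note, here is a minimal sketch of actually sending mail through the Graph API with an access token acquired as shown earlier (assumptions: $token holds the token response, and the recipient address userb@emaildomain.com is a placeholder for illustration).\n\u003cpre class\u003d\"brush:ps\"\u003e# Build the request body for the Graph sendMail action\n$mail \u003d @{\n    message \u003d @{\n        subject \u003d 'Test mail'\n        body \u003d @{ contentType \u003d 'Text'; content \u003d 'Hello from Graph API' }\n        toRecipients \u003d @(@{ emailAddress \u003d @{ address \u003d 'userb@emaildomain.com' } })\n    }\n    saveToSentItems \u003d $true\n} | ConvertTo-Json -Depth 5\n\n# Send as the mailbox permitted by the application access policy\nInvoke-RestMethod -Method Post -Uri \"https://graph.microsoft.com/v1.0/users/usera@emaildomain.com/sendMail\" -Headers @{ Authorization \u003d \"Bearer $($token.access_token)\" } -ContentType 'application/json' -Body $mail\n\u003c/pre\u003e\nIf the application access policy denies the target mailbox, the call is rejected with an access denied error. The Graph application permissions covered by the policy are listed below.\n\u003cbr /\u003e\u003cbr /\u003e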
As of the time of writing,\n\u003cbr /\u003e\u003cbr /\u003e\nMail.Read\u003cbr /\u003e\nMail.ReadBasic\u003cbr /\u003e\nMail.ReadBasic.All\u003cbr /\u003e\nMail.ReadWrite\u003cbr /\u003e\nMail.Send\u003cbr /\u003e\nMailboxSettings.Read\u003cbr /\u003e\nMailboxSettings.ReadWrite\u003cbr /\u003e\nCalendars.Read\u003cbr /\u003e\nCalendars.ReadWrite\u003cbr /\u003e\nContacts.Read\u003cbr /\u003e\nContacts.ReadWrite\u003cbr /\u003e\n\u003cbr /\u003eExchange Web Services permission scope: full_access_as_app.\u003cbr /\u003e\u003cbr /\u003e\nMore information can be found \u003ca href\u003d\"https://learn.microsoft.com/en-us/graph/auth-limit-mailbox-access\" target\u003d\"_blank\"\u003ehere\u003c/a\u003e."},"link":[{"rel":"replies","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/5778135876911825064/comments/default","title":"Post Comments"},{"rel":"replies","type":"text/html","href":"https://www.travisgan.com/2022/11/access-office-365-exchange-online.html#comment-form","title":"0 Comments"},{"rel":"edit","type":"application/atom+xml","href":"https://draft.blogger.com/feeds/3846425018755283358/posts/default/5778135876911825064"},{"rel":"self","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/posts/default/5778135876911825064"},{"rel":"alternate","type":"text/html","href":"https://www.travisgan.com/2022/11/access-office-365-exchange-online.html","title":"Access Office 365 Exchange Online Mailbox with Client Credential 
Flow"}],"author":[{"name":{"$t":"Travis"},"email":{"$t":"noreply@blogger.com"},"gd$image":{"rel":"http://schemas.google.com/g/2005#thumbnail","width":"35","height":"35","src":"//www.blogger.com/img/blogger_logo_round_35.png"}}],"media$thumbnail":{"xmlns$media":"http://search.yahoo.com/mrss/","url":"https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgY4u6bDrWYRgvIDY0NHB1U7fOSbiFsxmU1DqWnBjHU2WT9vgHvQtHvpp6ZIM6Rj3Mq5TOwmsgs4TbY61znomd51bMn0-P3h_oGRCgd-MlNJJjyeuCZdaTlDEHeKe437LiQtDW0k_WUvlAr79E-z5cbAuiddxljc-7JDykJ1oq0h5XfXdcOtzFNrS9HsA/s72-c/mail-oauth0.jpg","height":"72","width":"72"},"thr$total":{"$t":"0"}},{"id":{"$t":"tag:blogger.com,1999:blog-3846425018755283358.post-6432416055922322913"},"published":{"$t":"2022-08-17T07:00:00.007-05:00"},"updated":{"$t":"2023-01-12T22:29:07.631-06:00"},"category":[{"scheme":"http://www.blogger.com/atom/ns#","term":"PowerShell"},{"scheme":"http://www.blogger.com/atom/ns#","term":"Github"}],"title":{"type":"text","$t":"GitHub Create Private Fork of a Public Repository"},"content":{"type":"html","$t":"\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\u003ca href\u003d\"https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgmPyVgHciGcaKCXtOsgMRJfdkrGF1cMfWVfU0MFBV8EVsVCghHbY6QtoSPkA03Jr5GBuWQGrazUC3d9pMxwYUDwIQrd7H_MV1RCaT9DMv6hph1j_rTfPLqZwxaNNj7NucQeGP6Lr9MGU1guGd8CY0Bn802vDuvy66OHpCNKonxPLCPlv84YPypoW0kFw/s1600/clone.jpg\" style\u003d\"clear: left; float: left; margin-bottom: 1em; margin-right: 1em;\"\u003e\u003cimg alt\u003d\"\" border\u003d\"0\" data-original-height\u003d\"224\" data-original-width\u003d\"224\" height\u003d\"100\" src\u003d\"https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgmPyVgHciGcaKCXtOsgMRJfdkrGF1cMfWVfU0MFBV8EVsVCghHbY6QtoSPkA03Jr5GBuWQGrazUC3d9pMxwYUDwIQrd7H_MV1RCaT9DMv6hph1j_rTfPLqZwxaNNj7NucQeGP6Lr9MGU1guGd8CY0Bn802vDuvy66OHpCNKonxPLCPlv84YPypoW0kFw/w200-h200/clone.jpg\" width\u003d\"200\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\nWhen 
making a fork of a GitHub public repository, the fork is set up with public visibility. However, under certain circumstances, it may be desirable to give the fork private or internal (in an organization) visibility. \n\u003cbr /\u003e\u003cbr /\u003e\nHere is one way to indirectly 'fork' a repository with private or internal visibility.  \n\u003cbr /\u003e\u003cbr /\u003e\u003cbr /\u003e\n\u003cspan\u003e\u003ca name\u003d'more'\u003e\u003c/a\u003e\u003c/span\u003e\nFirst, create a new repo (private-repo.git) in GitHub through the UI or CLI.\n\u003cbr /\u003e\u003cbr /\u003e\nThen, clone the public repo to local. \u003cbr /\u003e\nThe --bare option is used to make a copy of all branches and tags without mapping them to a remote origin.\n\u003cbr /\u003e\u003cbr /\u003e\nThen, push the local copy to the private repo.\u003cbr /\u003e\nThe --mirror option is used to push all refs.\n\u003cbr /\u003e\u003cbr /\u003e\nFinally, remove the local bare copy (the public-repo.git directory).\n\n\u003cpre class\u003d\"brush:ps\"\u003egit clone --bare https://github.com/original_user/public-repo.git\n\ncd public-repo.git\ngit push --mirror https://github.com/your_user/private-repo.git\n\ncd ..\nrm -rf public-repo.git\n\u003c/pre\u003e\n\nTo make any changes, \u003cbr /\u003e\u003cbr /\u003e\nfirst clone the private copy to local,\u003cbr /\u003e\nthen make changes and push them up to the private repo.\n\n\u003cpre class\u003d\"brush:ps\"\u003egit clone https://github.com/your_user/private-repo.git\ncd private-repo\n\ngit commit\ngit push origin master\n\u003c/pre\u003e\n\nFor any subsequent changes on the private repo, pull them down to local.\n\n\u003cpre class\u003d\"brush:ps\"\u003egit pull origin # or git pull origin branch\n\n# or using fetch/merge\n\ngit fetch origin\ngit merge origin/master # or intended remote branch\n\u003c/pre\u003e\n\nTo update the private repo from the public repo, \u003cbr /\u003e\nfirst add the public repo as a remote. 
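The resulting remote layout can be sketched in a scratch repository (the URLs are the placeholder names used above; `git remote add` only writes the local config and does not contact the network):

```shell
# scratch repo to illustrate the two-remote layout (placeholder URLs)
git init -q demo-repo && cd demo-repo
git remote add origin https://github.com/your_user/private-repo.git
git remote add public https://github.com/original_user/public-repo.git
git remote -v    # lists both origin and public with their fetch/push URLs
```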
Now there should be two remotes:\u003cbr /\u003e\norigin pointing to the private repo (your_user), and public pointing to the public repo (original_user).\n\u003cbr /\u003e\u003cbr /\u003e\nThen pull the latest code from public to local,\u003cbr /\u003e\nthen push it to the private repo.\n\u003cbr /\u003e\u003cbr /\u003e\nTo also update all the tags from the public repo to the private repo,\u003cbr /\u003e\nfetch the public repo tags,\u003cbr /\u003e\nthen push the tags to the private repo.\n\n\u003cpre class\u003d\"brush:ps\"\u003ecd private-repo\ngit remote add public https://github.com/original_user/public-repo.git\n\n# check list of remotes\ngit remote -v\n\n# creates a merge commit\ngit pull public master \n\ngit push origin master\n\ngit fetch public --tags\ngit push origin --tags\n\n# if there is a need to remove the local tag to let the public tag take precedence\ngit tag -d \u0026lt;tag\u0026gt;\n\n# if there is a need to remove the remote tag\ngit push origin --delete \u0026lt;tag\u0026gt;\n\u003c/pre\u003e"},"link":[{"rel":"replies","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/6432416055922322913/comments/default","title":"Post Comments"},{"rel":"replies","type":"text/html","href":"https://www.travisgan.com/2022/08/github-create-private-fork-of-public.html#comment-form","title":"0 Comments"},{"rel":"edit","type":"application/atom+xml","href":"https://draft.blogger.com/feeds/3846425018755283358/posts/default/6432416055922322913"},{"rel":"self","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/posts/default/6432416055922322913"},{"rel":"alternate","type":"text/html","href":"https://www.travisgan.com/2022/08/github-create-private-fork-of-public.html","title":"GitHub Create Private Fork of a Public 
Repository"}],"author":[{"name":{"$t":"Travis"},"email":{"$t":"noreply@blogger.com"},"gd$image":{"rel":"http://schemas.google.com/g/2005#thumbnail","width":"35","height":"35","src":"//www.blogger.com/img/blogger_logo_round_35.png"}}],"media$thumbnail":{"xmlns$media":"http://search.yahoo.com/mrss/","url":"https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgmPyVgHciGcaKCXtOsgMRJfdkrGF1cMfWVfU0MFBV8EVsVCghHbY6QtoSPkA03Jr5GBuWQGrazUC3d9pMxwYUDwIQrd7H_MV1RCaT9DMv6hph1j_rTfPLqZwxaNNj7NucQeGP6Lr9MGU1guGd8CY0Bn802vDuvy66OHpCNKonxPLCPlv84YPypoW0kFw/s72-w200-c-h200/clone.jpg","height":"72","width":"72"},"thr$total":{"$t":"0"}},{"id":{"$t":"tag:blogger.com,1999:blog-3846425018755283358.post-305053587123761006"},"published":{"$t":"2021-04-08T23:22:00.010-05:00"},"updated":{"$t":"2021-04-09T12:56:12.586-05:00"},"category":[{"scheme":"http://www.blogger.com/atom/ns#","term":"PowerShell"},{"scheme":"http://www.blogger.com/atom/ns#","term":"Azure"}],"title":{"type":"text","$t":"Create Sharepoint Site with Azure Function using PowerShell"},"content":{"type":"html","$t":"\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\u003ca href\u003d\"https://lh3.googleusercontent.com/-AgGpnxIsvio/YHCRGAvK2-I/AAAAAAAAgT8/-fkKUtss3CY4ZRV_AkiwGj5d_X5aLv-xgCLcBGAsYHQ/image.png\" style\u003d\"clear: left; float: left; margin-bottom: 1em; margin-right: 1em;\"\u003e\u003cimg alt\u003d\"\" data-original-height\u003d\"634\" data-original-width\u003d\"695\" height\u003d\"183\" src\u003d\"https://lh3.googleusercontent.com/-AgGpnxIsvio/YHCRGAvK2-I/AAAAAAAAgT8/-fkKUtss3CY4ZRV_AkiwGj5d_X5aLv-xgCLcBGAsYHQ/w200-h183/image.png\" width\u003d\"200\" /\u003e\u003c/a\u003e\u003c/div\u003e\u003cdiv\u003e\u003cb\u003eBusiness Case\u003c/b\u003e\u003c/div\u003e\u003cdiv\u003eCreate SharePoint site based on user requirement without SharePoint Admin manually performing the task.\u003cdiv\u003e\u003cbr 
/\u003e\u003c/div\u003e\u003cdiv\u003e\u003cb\u003eDesign\u003c/b\u003e\u003cbr /\u003e\u003cdiv\u003eA SharePoint List is created and shared with designated users to enter the desired SharePoint site criteria. An Azure function with the PnP PowerShell module (compatible with PowerShell Core in Azure functions) periodically checks for any new entry in the SharePoint List and creates a SharePoint site based on the user-entered criteria.\u003c/div\u003e\u003cdiv\u003e\u003cb\u003e\u003cbr /\u003e\u003c/b\u003e\u003c/div\u003e\u003cdiv\u003e\u003cb\u003e\u003cbr /\u003e\u003c/b\u003e\u003c/div\u003e\u003cdiv\u003e\u003cbr /\u003e\u003c/div\u003e\u003cdiv\u003e\u003cspan\u003e\u003ca name\u003d'more'\u003e\u003c/a\u003e\u003c/span\u003e\u003c/div\u003e\u003cdiv\u003e\u003cb\u003eCreate a SharePoint List\u003c/b\u003e\u003c/div\u003e\u003c/div\u003e\u003cdiv\u003eCreate a new SharePoint List. In this example, it is named NewSiteList.\u003c/div\u003e\u003cdiv\u003e\u003cbr /\u003e\u003c/div\u003e\u003cdiv\u003e\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\u003ca href\u003d\"https://lh3.googleusercontent.com/-NIEaqo3y3Uw/YG_MUyIyC3I/AAAAAAAAgQs/7X1JLq2bO94wz_9zExrbCuI6hIjPp2E9ACLcBGAsYHQ/image.png\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg alt\u003d\"\" data-original-height\u003d\"555\" data-original-width\u003d\"380\" height\u003d\"240\" src\u003d\"https://lh3.googleusercontent.com/-NIEaqo3y3Uw/YG_MUyIyC3I/AAAAAAAAgQs/7X1JLq2bO94wz_9zExrbCuI6hIjPp2E9ACLcBGAsYHQ/image.png\" width\u003d\"164\" /\u003e\u003c/a\u003e\u003c/div\u003e\u003cbr /\u003e\u003c/div\u003e\u003cdiv\u003eCreate the necessary fields/columns to capture the new SharePoint site criteria.\u003c/div\u003e\u003cdiv\u003e\u003cbr /\u003e\u003c/div\u003e\u003cdiv\u003e\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\u003ca 
href\u003d\"https://lh3.googleusercontent.com/-BHELMLCPIR8/YG_MqWr79xI/AAAAAAAAgQ0/V8wI6cdsy_4twcFtMJli2_FiDo41XCMAwCLcBGAsYHQ/image.png\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg alt\u003d\"\" data-original-height\u003d\"314\" data-original-width\u003d\"1656\" height\u003d\"122\" src\u003d\"https://lh3.googleusercontent.com/-BHELMLCPIR8/YG_MqWr79xI/AAAAAAAAgQ0/V8wI6cdsy_4twcFtMJli2_FiDo41XCMAwCLcBGAsYHQ/w640-h122/image.png\" width\u003d\"640\" /\u003e\u003c/a\u003e\u003c/div\u003e\u003c/div\u003e\u003cdiv\u003e\u003cbr /\u003e\u003c/div\u003e\u003cdiv\u003e\u003cb\u003eCreate an account to access SharePoint List\u003c/b\u003e\u003c/div\u003eCreate an account in Azure AD which would be used by the Azure function to access the SharePoint List. Similar account is also used to create SharePoint site.\u003c/div\u003e\u003cdiv\u003e\u003cbr /\u003e\u003cb\u003eGrant account permission\u003c/b\u003e\u003c/div\u003e\u003cdiv\u003eGrant the account read (and write access in needed to update the list) access to the list.\u0026nbsp;\u003c/div\u003e\u003cdiv\u003e\u003cbr /\u003e\u003c/div\u003e\u003cdiv\u003e\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\u003ca href\u003d\"https://lh3.googleusercontent.com/-tI0Oxc4Modc/YG_MygD9EGI/AAAAAAAAgQ8/FFzts0E2ZQE7s7VvSgf_Mqx0okGcTLL1QCLcBGAsYHQ/image.png\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg alt\u003d\"\" data-original-height\u003d\"505\" data-original-width\u003d\"517\" height\u003d\"240\" src\u003d\"https://lh3.googleusercontent.com/-tI0Oxc4Modc/YG_MygD9EGI/AAAAAAAAgQ8/FFzts0E2ZQE7s7VvSgf_Mqx0okGcTLL1QCLcBGAsYHQ/image.png\" width\u003d\"246\" /\u003e\u003c/a\u003e\u003c/div\u003e\u003cdiv\u003e\u003cbr /\u003e\u003c/div\u003eCheck the setting where user can create site or set site creation permission accordingly.\u003cbr /\u003e\u003cbr /\u003e\u003c/div\u003e\u003cdiv\u003e\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; 
text-align: center;\"\u003e\u003ca href\u003d\"https://lh3.googleusercontent.com/-lcTdTI64WV4/YG_M61sqdnI/AAAAAAAAgRE/qrQ_mmIA3AcPtTyFxZ9lVGs7atdCD7IgACLcBGAsYHQ/image.png\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg alt\u003d\"\" data-original-height\u003d\"493\" data-original-width\u003d\"1603\" height\u003d\"196\" src\u003d\"https://lh3.googleusercontent.com/-lcTdTI64WV4/YG_M61sqdnI/AAAAAAAAgRE/qrQ_mmIA3AcPtTyFxZ9lVGs7atdCD7IgACLcBGAsYHQ/w640-h196/image.png\" width\u003d\"640\" /\u003e\u003c/a\u003e\u003c/div\u003e\u003cbr /\u003e\u003cb\u003eCreate Azure function\u003c/b\u003e\u003c/div\u003e\u003cdiv\u003eSelect code and PowerShell Core as runtime stack with PowerShell 7.0\u003c/div\u003e\u003cdiv\u003e\u003cbr /\u003e\u003c/div\u003e\u003cdiv\u003e\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\u003ca href\u003d\"https://lh3.googleusercontent.com/-OdBidsST2Ow/YG_OCcnx3nI/AAAAAAAAgRU/yojrQfuwxc4N13cxmh22cS20Jsnh9Ct5wCLcBGAsYHQ/image.png\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg alt\u003d\"\" data-original-height\u003d\"703\" data-original-width\u003d\"1097\" height\u003d\"410\" src\u003d\"https://lh3.googleusercontent.com/-OdBidsST2Ow/YG_OCcnx3nI/AAAAAAAAgRU/yojrQfuwxc4N13cxmh22cS20Jsnh9Ct5wCLcBGAsYHQ/w640-h410/image.png\" width\u003d\"640\" /\u003e\u003c/a\u003e\u003c/div\u003e\u003cbr /\u003e\u003c/div\u003e\u003c/div\u003e\u003cdiv\u003e\u003cb\u003eStore user credential in Azure Key Vault\u003c/b\u003e\u003c/div\u003e\u003cdiv\u003eAzure Key Vault is used to store the credential safely. 
In order for Azure function to access the Secret in the Key vault, an identity is created for Azure function, and the identity is used for granting access permission in Key vault.\u003c/div\u003e\u003cdiv\u003e\u003cbr /\u003e\u003c/div\u003e\u003cdiv\u003eUnder identity, on System assigned tab, create system managed identity by selecting On for the status. Copy the Object ID for later use.\u003c/div\u003e\u003cdiv\u003e\u003cbr /\u003e\u003c/div\u003e\u003cdiv\u003e\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\u003ca href\u003d\"https://lh3.googleusercontent.com/-0djaZUT6LSo/YG_OQgq-k7I/AAAAAAAAgRY/Y5uv9_o-sQIkB63dkfnVRiQeNWmeVzOcgCLcBGAsYHQ/image.png\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg alt\u003d\"\" data-original-height\u003d\"704\" data-original-width\u003d\"1057\" height\u003d\"426\" src\u003d\"https://lh3.googleusercontent.com/-0djaZUT6LSo/YG_OQgq-k7I/AAAAAAAAgRY/Y5uv9_o-sQIkB63dkfnVRiQeNWmeVzOcgCLcBGAsYHQ/w640-h426/image.png\" width\u003d\"640\" /\u003e\u003c/a\u003e\u003c/div\u003e\u003cbr /\u003eCreate Key Vault.\u003c/div\u003e\u003cdiv\u003e\u003cbr /\u003e\u003c/div\u003e\u003cdiv\u003e\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\u003ca href\u003d\"https://lh3.googleusercontent.com/--dZjl3EvAuw/YG_OmY8x4bI/AAAAAAAAgRk/crDGVOgygpMOfnoqzsObgkkrjv624Mz9ACLcBGAsYHQ/image.png\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg alt\u003d\"\" data-original-height\u003d\"495\" data-original-width\u003d\"762\" height\u003d\"416\" src\u003d\"https://lh3.googleusercontent.com/--dZjl3EvAuw/YG_OmY8x4bI/AAAAAAAAgRk/crDGVOgygpMOfnoqzsObgkkrjv624Mz9ACLcBGAsYHQ/w640-h416/image.png\" width\u003d\"640\" /\u003e\u003c/a\u003e\u003c/div\u003e\u003cbr /\u003eCreate a Secret each for user and password\u003c/div\u003e\u003cdiv\u003e\u003cbr /\u003e\u003c/div\u003e\u003cdiv\u003e\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; 
text-align: center;\"\u003e\u003ca href\u003d\"https://lh3.googleusercontent.com/-CNJVGRafUSo/YG_OtwvzsLI/AAAAAAAAgRo/m9h02aQFF3kSp-x0vnu0r49Tsd0VOUfjACLcBGAsYHQ/image.png\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg alt\u003d\"\" data-original-height\u003d\"542\" data-original-width\u003d\"916\" height\u003d\"378\" src\u003d\"https://lh3.googleusercontent.com/-CNJVGRafUSo/YG_OtwvzsLI/AAAAAAAAgRo/m9h02aQFF3kSp-x0vnu0r49Tsd0VOUfjACLcBGAsYHQ/w640-h378/image.png\" width\u003d\"640\" /\u003e\u003c/a\u003e\u003c/div\u003e\u003cbr /\u003eCopy the secret identifier for user and password\u003c/div\u003e\u003cdiv\u003e\u003cbr /\u003e\u003c/div\u003e\u003cdiv\u003e\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\u003ca href\u003d\"https://lh3.googleusercontent.com/-3oX2MSS4Hik/YG_O1QMQHDI/AAAAAAAAgRs/Ppp4oZDiKEgUGR_um020BuoPU3EwhUEqACLcBGAsYHQ/image.png\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg alt\u003d\"\" data-original-height\u003d\"573\" data-original-width\u003d\"676\" height\u003d\"543\" src\u003d\"https://lh3.googleusercontent.com/-3oX2MSS4Hik/YG_O1QMQHDI/AAAAAAAAgRs/Ppp4oZDiKEgUGR_um020BuoPU3EwhUEqACLcBGAsYHQ/w640-h543/image.png\" width\u003d\"640\" /\u003e\u003c/a\u003e\u003c/div\u003e\u003cbr /\u003eGrant Get access to Secret for the Azure function system managed identity created earlier\u003c/div\u003e\u003cdiv\u003e\u003cbr /\u003e\u003c/div\u003e\u003cdiv\u003e\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\u003ca href\u003d\"https://lh3.googleusercontent.com/-JduQemzRThg/YG_RPzAZDoI/AAAAAAAAgS4/GgiO29A7kgsu1KKbPjTp5ecDPcW8poXhQCLcBGAsYHQ/image.png\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg alt\u003d\"\" 
data-original-height\u003d\"835\" data-original-width\u003d\"552\" height\u003d\"400\" src\u003d\"https://lh3.googleusercontent.com/-JduQemzRThg/YG_RPzAZDoI/AAAAAAAAgS4/GgiO29A7kgsu1KKbPjTp5ecDPcW8poXhQCLcBGAsYHQ/w265-h400/image.png\" width\u003d\"265\" /\u003e\u003c/a\u003e\u003c/div\u003e\u003c/div\u003e\u003cbr /\u003e\u003c/div\u003e\u003cb\u003eCreate new application settings each for the user and password in the Azure function.\u003c/b\u003e\u003c/div\u003e\u003cdiv\u003eBack to Azure function, under Configuration, create new application setting with a name and the secret identifier created previously.\u003c/div\u003e\u003cdiv\u003e\u003cbr /\u003e\u003c/div\u003e\u003cdiv\u003eThe format of the value is\u0026nbsp;\u003c/div\u003e\u003cdiv\u003e@Microsoft.KeyVault(SecretUri\u003d\u0026lt;secret identifier\u0026gt;)\u003c/div\u003e\u003cdiv\u003e\u003cbr /\u003e\u003c/div\u003e\u003cdiv\u003eMore information in this article,\u003c/div\u003e\u003cdiv\u003e\u003cdiv\u003ehttps://docs.microsoft.com/en-us/azure/app-service/app-service-key-vault-references#reference-syntax\u003c/div\u003e\u003c/div\u003e\u003cdiv\u003e\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\u003cbr /\u003e\u003c/div\u003e\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\u003ca href\u003d\"https://lh3.googleusercontent.com/-n0Far7D0D7Y/YG_Ph-NaAHI/AAAAAAAAgSM/HjJw9wPZn4E1DTl-QMJi80VhYxDoH_V7ACLcBGAsYHQ/image.png\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg alt\u003d\"\" data-original-height\u003d\"399\" data-original-width\u003d\"1766\" height\u003d\"144\" src\u003d\"https://lh3.googleusercontent.com/-n0Far7D0D7Y/YG_Ph-NaAHI/AAAAAAAAgSM/HjJw9wPZn4E1DTl-QMJi80VhYxDoH_V7ACLcBGAsYHQ/w640-h144/image.png\" width\u003d\"640\" /\u003e\u003c/a\u003e\u003c/div\u003e\u003cbr /\u003e\u003c/div\u003e\u003cdiv\u003e\u003cb\u003eConfigure PnP PowerShell Shell 
permission\u003c/b\u003e\u003c/div\u003e\u003cdiv\u003eAdmin permission is required to grant the PnP PowerShell shell application permission to call the Microsoft Graph API on behalf of the user.\u0026nbsp;\u003c/div\u003e\u003cdiv\u003e\u003cbr /\u003e\u003c/div\u003e\u003cdiv\u003eTo grant permission, execute the following command locally with an Azure AD admin account (with the Application Administrator role); it prompts to grant admin consent the first time.\u003c/div\u003e\u003cdiv\u003e\u003cbr /\u003e\u003c/div\u003e\u003cdiv\u003e\u003cpre class\u003d\"brush:ps\"\u003eConnect-PnPOnline -Url https://test.sharepoint.com/sites/test -Interactive\u003c/pre\u003e\u003c/div\u003e\u003cdiv\u003e\u003cbr /\u003e\u003c/div\u003e\u003cdiv\u003eAdmin consent can also be granted through the Azure AD enterprise application.\u003c/div\u003e\u003cdiv\u003e\u003cbr /\u003e\u003c/div\u003e\u003cdiv\u003e\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\u003ca href\u003d\"https://lh3.googleusercontent.com/-IQBbkshl_YA/YG_U1dtbOuI/AAAAAAAAgTY/2_C-zN-jaf4yfJSdt69t6rQ7OkyjjyuLQCLcBGAsYHQ/image.png\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg alt\u003d\"\" data-original-height\u003d\"677\" data-original-width\u003d\"1172\" height\u003d\"370\" src\u003d\"https://lh3.googleusercontent.com/-IQBbkshl_YA/YG_U1dtbOuI/AAAAAAAAgTY/2_C-zN-jaf4yfJSdt69t6rQ7OkyjjyuLQCLcBGAsYHQ/w640-h370/image.png\" width\u003d\"640\" /\u003e\u003c/a\u003e\u003c/div\u003e\u003c/div\u003e\u003cdiv\u003e\u003cbr /\u003e\u003c/div\u003e\u003cdiv\u003e\u003cb\u003eSetup PnP PowerShell Module in Azure function\u003c/b\u003e\u003c/div\u003e\u003cdiv\u003eOption 1, declare the module dependency in the requirements.psd1 file\u003c/div\u003e\u003cdiv\u003e\u003cbr /\u003e\u003c/div\u003e\u003cdiv\u003e\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\u003ca 
href\u003d\"https://lh3.googleusercontent.com/-ypH-JcdIGeY/YG_QP6UvelI/AAAAAAAAgSU/uxN6NzEMB3ED0IBaNU1b44vCt7v5no-kQCLcBGAsYHQ/image.png\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg alt\u003d\"\" data-original-height\u003d\"561\" data-original-width\u003d\"1538\" height\u003d\"234\" src\u003d\"https://lh3.googleusercontent.com/-ypH-JcdIGeY/YG_QP6UvelI/AAAAAAAAgSU/uxN6NzEMB3ED0IBaNU1b44vCt7v5no-kQCLcBGAsYHQ/w640-h234/image.png\" width\u003d\"640\" /\u003e\u003c/a\u003e\u003c/div\u003e\u003cbr /\u003eOption 2, via Kudu, create a Modules folder under site/wwwroot. Copy (drag and drop) the PnP.PowerShell module (downloaded to PC) to the Modules folder.\u003c/div\u003e\u003cdiv\u003e\u003cbr /\u003e\u003c/div\u003e\u003cdiv\u003e\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\u003ca href\u003d\"https://lh3.googleusercontent.com/-AFah3H2tm8k/YG_QUTfDJhI/AAAAAAAAgSY/so8eOu_MCSkBMM79oOjWjwXmU7O4KYeJACLcBGAsYHQ/image.png\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg alt\u003d\"\" data-original-height\u003d\"300\" data-original-width\u003d\"728\" height\u003d\"264\" src\u003d\"https://lh3.googleusercontent.com/-AFah3H2tm8k/YG_QUTfDJhI/AAAAAAAAgSY/so8eOu_MCSkBMM79oOjWjwXmU7O4KYeJACLcBGAsYHQ/w640-h264/image.png\" width\u003d\"640\" /\u003e\u003c/a\u003e\u003c/div\u003e\u003cbr /\u003e\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\u003ca href\u003d\"https://lh3.googleusercontent.com/-sNjc3oxMEwI/YG_Q1P0RMLI/AAAAAAAAgSs/qvM4lhZ5yKc5Cgo6gT7W8v0BeBKy4brcACLcBGAsYHQ/image.png\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg alt\u003d\"\" data-original-height\u003d\"850\" data-original-width\u003d\"820\" height\u003d\"640\" 
src\u003d\"https://lh3.googleusercontent.com/-sNjc3oxMEwI/YG_Q1P0RMLI/AAAAAAAAgSs/qvM4lhZ5yKc5Cgo6gT7W8v0BeBKy4brcACLcBGAsYHQ/w619-h640/image.png\" width\u003d\"619\" /\u003e\u003c/a\u003e\u003c/div\u003e\u003cbr /\u003e\u003c/div\u003e\u003c/div\u003e\u003cdiv\u003e\u003cb\u003eAzure function code\u003c/b\u003e\u003c/div\u003e\u003cdiv\u003eAdd new function. Select Time Trigger to periodically trigger every 5 minutes.\u003c/div\u003e\u003cdiv\u003e\u003cbr /\u003e\u003c/div\u003e\u003cdiv\u003e\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\u003ca href\u003d\"https://lh3.googleusercontent.com/-IcSGCIQMuqo/YG_Rf-9M2hI/AAAAAAAAgTA/pBNF4ua8xSsgnczmoAm1jXyuTMB8f0UnACLcBGAsYHQ/image.png\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg alt\u003d\"\" data-original-height\u003d\"646\" data-original-width\u003d\"1677\" height\u003d\"246\" src\u003d\"https://lh3.googleusercontent.com/-IcSGCIQMuqo/YG_Rf-9M2hI/AAAAAAAAgTA/pBNF4ua8xSsgnczmoAm1jXyuTMB8f0UnACLcBGAsYHQ/w640-h246/image.png\" width\u003d\"640\" /\u003e\u003c/a\u003e\u003c/div\u003e\u003cdiv\u003e\u003cbr /\u003e\u003c/div\u003eAzure function code\u003cbr /\u003e\u003cbr /\u003e\u003c/div\u003e\u003cdiv\u003e\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\u003ca href\u003d\"https://lh3.googleusercontent.com/-hHKGpkFNRN0/YG_SLBCLiQI/AAAAAAAAgTI/kZoys9dvF1QlHWKmSOsgEft2mTmWfmsHgCLcBGAsYHQ/image.png\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg alt\u003d\"\" data-original-height\u003d\"691\" data-original-width\u003d\"1336\" height\u003d\"332\" src\u003d\"https://lh3.googleusercontent.com/-hHKGpkFNRN0/YG_SLBCLiQI/AAAAAAAAgTI/kZoys9dvF1QlHWKmSOsgEft2mTmWfmsHgCLcBGAsYHQ/w640-h332/image.png\" width\u003d\"640\" /\u003e\u003c/a\u003e\u003c/div\u003e\u003cbr /\u003e\u003cbr /\u003e\u003c/div\u003e\u003cdiv\u003e\u003cdiv\u003e##If using the option 2, import the module with the 
path\u003c/div\u003e\n\u003cpre class\u003d\"brush:ps\"\u003e\nImport-Module -Name C:\\home\\site\\wwwroot\\Modules\\PnP.PowerShell\n\n##Retrieve user credential from application settings where the source is the secrets stored in key vault\n$user \u003d $env:spfunctionuser\n$pw \u003d $env:spfunctionpw | ConvertTo-SecureString -AsPlainText\n$cred \u003d New-Object System.Management.Automation.PSCredential($user, $pw)\n\n##Connect to the SharePoint site where the SharePoint list is hosted\nConnect-PnPOnline -Url https://test.sharepoint.com/sites/test -Credentials $cred -ErrorAction Stop\n\n##Only get the entries where SiteCreated is 'No' using a CAML filter query\n$sites \u003d Get-PnPListItem -List lists/newsitelist -Query \"\u0026lt;View\u0026gt;\u0026lt;Query\u0026gt;\u0026lt;Where\u0026gt;\u0026lt;Eq\u0026gt;\u0026lt;FieldRef Name\u003d'SiteCreated'/\u0026gt;\u0026lt;Value Type\u003d'Integer'\u0026gt;0\u0026lt;/Value\u0026gt;\u0026lt;/Eq\u0026gt;\u0026lt;/Where\u0026gt;\u0026lt;/Query\u0026gt;\u0026lt;/View\u0026gt;\"\n\n##Loop through the entries, create the site based on the criteria provided, and update the SiteCreated field of the entry\nif ($sites) {\n    $sites | ForEach-Object {\n        New-PnPSite -Title $_.FieldValues.Title `\n            -Type $_.FieldValues.SiteType `\n            -Url $(\"https://test.sharepoint.com/sites/$($_.FieldValues.Title)\") `\n            -ShareByEmailEnabled:$_.FieldValues.ShareByEmailEnabled `\n            -SiteDesign $_.FieldValues.SiteDesign `\n            -Owner $_.FieldValues.SiteOwner.Email\n\n        if ($?) {\n            Write-Output \"site created\"\n            Set-PnPListItem -List lists/newsitelist -Identity $_.Id -Values @{\"SiteCreated\" \u003d 1}\n        }\n    }\n}\n\u003c/pre\u003e\u003c/div\u003e\u003cdiv\u003e\u003cbr /\u003e\u003c/div\u003e\u003cdiv\u003e\u003cbr /\u003e\u003c/div\u003e\u003cdiv\u003eOnce the Azure function schedule triggers the execution, a new SharePoint site is created accordingly. 
Success!\u003c/div\u003e\u003cdiv\u003e\u003cbr /\u003e\u003c/div\u003e\u003cdiv\u003e\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\u003ca href\u003d\"https://lh3.googleusercontent.com/-ILUcfOjnrio/YG_STl1kYhI/AAAAAAAAgTM/nRGYz7AdOdAk1jOiwPVQZgDxfbpAgFfrACLcBGAsYHQ/image.png\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg alt\u003d\"\" data-original-height\u003d\"419\" data-original-width\u003d\"1191\" height\u003d\"226\" src\u003d\"https://lh3.googleusercontent.com/-ILUcfOjnrio/YG_STl1kYhI/AAAAAAAAgTM/nRGYz7AdOdAk1jOiwPVQZgDxfbpAgFfrACLcBGAsYHQ/w640-h226/image.png\" width\u003d\"640\" /\u003e\u003c/a\u003e\u003c/div\u003e\u003cdiv\u003e\u003cbr /\u003e\u003c/div\u003eInstead of executing the Azure function on a schedule, a potential improvement is to leverage SharePoint webhooks to send a notification, for example via an Azure queue, to invoke the Azure function when there is a new entry in the SharePoint List.\u003c/div\u003e\u003c/div\u003e"},"link":[{"rel":"replies","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/305053587123761006/comments/default","title":"Post Comments"},{"rel":"replies","type":"text/html","href":"https://www.travisgan.com/2021/04/create-sharepoint-site-with-azure.html#comment-form","title":"0 Comments"},{"rel":"edit","type":"application/atom+xml","href":"https://draft.blogger.com/feeds/3846425018755283358/posts/default/305053587123761006"},{"rel":"self","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/posts/default/305053587123761006"},{"rel":"alternate","type":"text/html","href":"https://www.travisgan.com/2021/04/create-sharepoint-site-with-azure.html","title":"Create Sharepoint Site with Azure Function using 
PowerShell"}],"author":[{"name":{"$t":"Travis"},"email":{"$t":"noreply@blogger.com"},"gd$image":{"rel":"http://schemas.google.com/g/2005#thumbnail","width":"35","height":"35","src":"//www.blogger.com/img/blogger_logo_round_35.png"}}],"media$thumbnail":{"xmlns$media":"http://search.yahoo.com/mrss/","url":"https://lh3.googleusercontent.com/-AgGpnxIsvio/YHCRGAvK2-I/AAAAAAAAgT8/-fkKUtss3CY4ZRV_AkiwGj5d_X5aLv-xgCLcBGAsYHQ/s72-w200-c-h183/image.png","height":"72","width":"72"},"thr$total":{"$t":"0"}},{"id":{"$t":"tag:blogger.com,1999:blog-3846425018755283358.post-7507163608548443699"},"published":{"$t":"2019-04-25T07:00:00.000-05:00"},"updated":{"$t":"2019-05-07T21:47:36.012-05:00"},"category":[{"scheme":"http://www.blogger.com/atom/ns#","term":"Troubleshooting"}],"title":{"type":"text","$t":"Network Traffic Capture Without Installing Any Software"},"content":{"type":"html","$t":"\u003ca href\u003d\"https://3.bp.blogspot.com/-gU1WsvQukjc/XMEpy8B8QpI/AAAAAAAAVU0/Ikw9hJwPmPk9dASTDWPI_QrCzegT4T_zACLcBGAs/s1600/netmon.png\" imageanchor\u003d\"1\"\u003e\u003cimg border\u003d\"0\" data-original-height\u003d\"173\" data-original-width\u003d\"589\" src\u003d\"https://3.bp.blogspot.com/-gU1WsvQukjc/XMEpy8B8QpI/AAAAAAAAVU0/Ikw9hJwPmPk9dASTDWPI_QrCzegT4T_zACLcBGAs/s1600/netmon.png\" /\u003e\u003c/a\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\nFrom time to time, there is a need to capture network traffic for troubleshooting on a server. Network tools like Wireshark are popular for capturing traffic. However, these tools often need additional installation on the server, and depending on your security team's rules, it may take days for them to be installed.\u003cbr /\u003e\n\u003cbr /\u003e\nThere is an alternate way to capture network traffic on Windows without installing additional software: Netsh trace. Most of you may be familiar with Netsh for common purposes like firewall, HTTP listener, network interface info, etc. 
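For instance, a few common read-only netsh queries (a quick sketch; the available contexts and their output vary by Windows version):

```shell
rem list interface IP configuration
netsh interface ipv4 show addresses
rem show the Windows Firewall profile state
netsh advfirewall show allprofiles
rem show the HTTP.sys listener state
netsh http show servicestate
```

These run in either cmd or PowerShell on Windows.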
Netsh can also be used to collect a network trace.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ca name\u003d'more'\u003e\u003c/a\u003eOpen an elevated command prompt / PowerShell window,\u003cbr /\u003e\n\u003cbr /\u003e\nnetsh trace -?\u003cbr /\u003e\n\u003cbr /\u003e\nThis shows the list of parameters, their purposes, examples and other useful information for netsh trace\u003cbr /\u003e\n\u003cbr /\u003e\nTo simply start a trace,\u003cbr /\u003e\nnetsh trace start capture\u003dyes tracefile\u003dc:\\nettrace-example.etl\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ca href\u003d\"https://1.bp.blogspot.com/-aA3ZzXSWKEI/XMEkYoYIw_I/AAAAAAAAVTw/ffKfanKaQIYQxYmT5_4ANPyMscngSj7NQCLcBGAs/s1600/netsh-trace-start.png\" imageanchor\u003d\"1\"\u003e\u003cimg border\u003d\"0\" data-original-height\u003d\"175\" data-original-width\u003d\"569\" height\u003d\"122\" src\u003d\"https://1.bp.blogspot.com/-aA3ZzXSWKEI/XMEkYoYIw_I/AAAAAAAAVTw/ffKfanKaQIYQxYmT5_4ANPyMscngSj7NQCLcBGAs/s400/netsh-trace-start.png\" width\u003d\"400\" /\u003e\u003c/a\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\nTo stop the trace,\u003cbr /\u003e\nnetsh trace stop\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ca href\u003d\"https://2.bp.blogspot.com/-84xFc0Wk2mE/XMEkwIF1qHI/AAAAAAAAVT4/In6aWd5vUZAWJSqnrWf3xUuaWHgn5l0HgCLcBGAs/s1600/netsh-trace-stop.png\" imageanchor\u003d\"1\"\u003e\u003cimg border\u003d\"0\" data-original-height\u003d\"121\" data-original-width\u003d\"766\" height\u003d\"100\" src\u003d\"https://2.bp.blogspot.com/-84xFc0Wk2mE/XMEkwIF1qHI/AAAAAAAAVT4/In6aWd5vUZAWJSqnrWf3xUuaWHgn5l0HgCLcBGAs/s640/netsh-trace-stop.png\" width\u003d\"640\" /\u003e\u003c/a\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\nThese are the trace files generated.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ca href\u003d\"https://4.bp.blogspot.com/-7XsPrNIzWGw/XMElGIa23jI/AAAAAAAAVUA/DXKrXCQumd4RFkTijlIPM507HFz4nNPRgCLcBGAs/s1600/netsh-trace-file.png\" imageanchor\u003d\"1\"\u003e\u003cimg border\u003d\"0\" data-original-height\u003d\"131\" 
data-original-width\u003d\"240\" height\u003d\"108\" src\u003d\"https://4.bp.blogspot.com/-7XsPrNIzWGw/XMElGIa23jI/AAAAAAAAVUA/DXKrXCQumd4RFkTijlIPM507HFz4nNPRgCLcBGAs/s200/netsh-trace-file.png\" width\u003d\"200\" /\u003e\u003c/a\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\nYou can copy the trace file to a computer that has netmon (network monitor) installed. You may download the software \u003ca href\u003d\"https://www.microsoft.com/en-us/download/details.aspx?id\u003d4865\" target\u003d\"_blank\"\u003ehere\u003c/a\u003e. Presumably you can at least install the software easily on your own computer.\u003cbr /\u003e\n\u003cbr /\u003e\nOpen netmon and read the trace file.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ca href\u003d\"https://2.bp.blogspot.com/-nhUUH4JPgHk/XMElV-NGmdI/AAAAAAAAVUE/bGQVDfuBJj8OjWF4aeheDL8FmJfo0umMQCLcBGAs/s1600/netmon-open-trace.png\" imageanchor\u003d\"1\"\u003e\u003cimg border\u003d\"0\" data-original-height\u003d\"173\" data-original-width\u003d\"395\" height\u003d\"140\" src\u003d\"https://2.bp.blogspot.com/-nhUUH4JPgHk/XMElV-NGmdI/AAAAAAAAVUE/bGQVDfuBJj8OjWF4aeheDL8FmJfo0umMQCLcBGAs/s320/netmon-open-trace.png\" width\u003d\"320\" /\u003e\u003c/a\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\nYou may see parser issues in the description column. \n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ca href\u003d\"https://3.bp.blogspot.com/-Dozgrq8lX9o/XMElqjeZcfI/AAAAAAAAVUU/xGBBLJkfInUcLcUctIXbGKgJPsRJbb2IACLcBGAs/s1600/netmon-parser-issue.png\" imageanchor\u003d\"1\"\u003e\u003cimg border\u003d\"0\" data-original-height\u003d\"430\" data-original-width\u003d\"1419\" height\u003d\"193\" src\u003d\"https://3.bp.blogspot.com/-Dozgrq8lX9o/XMElqjeZcfI/AAAAAAAAVUU/xGBBLJkfInUcLcUctIXbGKgJPsRJbb2IACLcBGAs/s640/netmon-parser-issue.png\" width\u003d\"640\" /\u003e\u003c/a\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\nFor those who paid close attention during the netmon installation, it also prompted to install the parsers. 
However, the parsers are not configured to be active by default. To configure the parsers, go to Tools\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ca href\u003d\"https://3.bp.blogspot.com/-HFO8Mm7jGsQ/XMEl3UzuXsI/AAAAAAAAVUY/KXss1EGt3-ITTfEnTzKPaHt8DBxZt9ONQCLcBGAs/s1600/netmon-tool.png\" imageanchor\u003d\"1\"\u003e\u003cimg border\u003d\"0\" data-original-height\u003d\"145\" data-original-width\u003d\"406\" height\u003d\"114\" src\u003d\"https://3.bp.blogspot.com/-HFO8Mm7jGsQ/XMEl3UzuXsI/AAAAAAAAVUY/KXss1EGt3-ITTfEnTzKPaHt8DBxZt9ONQCLcBGAs/s320/netmon-tool.png\" width\u003d\"320\" /\u003e\u003c/a\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\nSelect Windows, and click Set As Active\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ca href\u003d\"https://3.bp.blogspot.com/-dGpNkAZpi9c/XMEl8Kz9yBI/AAAAAAAAVUc/frZpsKxp0m47YFSeKqC2ANJjA2U7CVpigCLcBGAs/s1600/netsh-parser.png\" imageanchor\u003d\"1\"\u003e\u003cimg border\u003d\"0\" data-original-height\u003d\"504\" data-original-width\u003d\"446\" height\u003d\"400\" src\u003d\"https://3.bp.blogspot.com/-dGpNkAZpi9c/XMEl8Kz9yBI/AAAAAAAAVUc/frZpsKxp0m47YFSeKqC2ANJjA2U7CVpigCLcBGAs/s400/netsh-parser.png\" width\u003d\"353\" /\u003e\u003c/a\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\nNow the description is more useful and ready for troubleshooting and analysis.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ca href\u003d\"https://2.bp.blogspot.com/-eY4kJj3w0xs/XMEmBQipskI/AAAAAAAAVUg/RGKQGibJJykoukM1sYi58QkFNf5AMDs_QCLcBGAs/s1600/netmon-analysis.png\" imageanchor\u003d\"1\"\u003e\u003cimg border\u003d\"0\" data-original-height\u003d\"427\" data-original-width\u003d\"1046\" height\u003d\"260\" src\u003d\"https://2.bp.blogspot.com/-eY4kJj3w0xs/XMEmBQipskI/AAAAAAAAVUg/RGKQGibJJykoukM1sYi58QkFNf5AMDs_QCLcBGAs/s640/netmon-analysis.png\" width\u003d\"640\" /\u003e\u003c/a\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\nThat's it! Happy troubleshooting. 
More information on netmon filter could be found \u003ca href\u003d\"https://social.technet.microsoft.com/wiki/contents/articles/1130.network-monitor-ipv4-filtering.aspx\" target\u003d\"_blank\"\u003ehere\u003c/a\u003e."},"link":[{"rel":"replies","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/7507163608548443699/comments/default","title":"Post Comments"},{"rel":"replies","type":"text/html","href":"https://www.travisgan.com/2019/04/network-traffic-capture-without.html#comment-form","title":"0 Comments"},{"rel":"edit","type":"application/atom+xml","href":"https://draft.blogger.com/feeds/3846425018755283358/posts/default/7507163608548443699"},{"rel":"self","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/posts/default/7507163608548443699"},{"rel":"alternate","type":"text/html","href":"https://www.travisgan.com/2019/04/network-traffic-capture-without.html","title":"Network Traffic Capture Without Installing Any Software"}],"author":[{"name":{"$t":"Travis"},"email":{"$t":"noreply@blogger.com"},"gd$image":{"rel":"http://schemas.google.com/g/2005#thumbnail","width":"35","height":"35","src":"//www.blogger.com/img/blogger_logo_round_35.png"}}],"media$thumbnail":{"xmlns$media":"http://search.yahoo.com/mrss/","url":"https://3.bp.blogspot.com/-gU1WsvQukjc/XMEpy8B8QpI/AAAAAAAAVU0/Ikw9hJwPmPk9dASTDWPI_QrCzegT4T_zACLcBGAs/s72-c/netmon.png","height":"72","width":"72"},"thr$total":{"$t":"0"}},{"id":{"$t":"tag:blogger.com,1999:blog-3846425018755283358.post-7460865691864655045"},"published":{"$t":"2019-02-11T23:04:00.000-06:00"},"updated":{"$t":"2019-02-14T21:19:01.736-06:00"},"category":[{"scheme":"http://www.blogger.com/atom/ns#","term":"Azure"}],"title":{"type":"text","$t":"Inserting and Hosting 551 Millions Records Cheaply - Case Study"},"content":{"type":"html","$t":"\u003ca href\u003d\"https://3.bp.blogspot.com/-fgwCWeUKcUc/XGOdoem-bxI/AAAAAAAAUiY/On4qkg0pKx4e_MFTQhjwcpuh5WbBK5PCACLcBGAs/s1600/azureoffering.JPG\" 
imageanchor\u003d\"1\"\u003e\u003cimg border\u003d\"0\" data-original-height\u003d\"350\" data-original-width\u003d\"732\" height\u003d\"306\" src\u003d\"https://3.bp.blogspot.com/-fgwCWeUKcUc/XGOdoem-bxI/AAAAAAAAUiY/On4qkg0pKx4e_MFTQhjwcpuh5WbBK5PCACLcBGAs/s640/azureoffering.JPG\" width\u003d\"640\" /\u003e\u003c/a\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\nRecently I worked on a \u003ca href\u003d\"https://passwordcheck.travisgan.com/\" target\u003d\"_blank\"\u003epet project\u003c/a\u003e with a cloud implementation. There was a text dump file with 551 million records that needed to be stored in a way where filtered results could be quickly returned from a query, with a goal of a low-cost implementation.\u003cbr /\u003e\n\u003cbr /\u003e\nThe dataset is static. Large number of records. No relational requirement. The storage and querying have to be fast and low cost. Azure table storage seems to be a good candidate for this purpose. Azure table storage is a NoSQL key-value store and is very cheap.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ca name\u003d'more'\u003e\u003c/a\u003eThe dataset (with some Azure table metadata) is estimated to be around 50 GB when it is stored in Azure table storage. With the choice of locally redundant storage (LRS), the price of the dataset storage is ~50 GB * $0.07 / GB per month \u003d ~$3.50 per month. For the transaction cost calculation, Microsoft charges $0.00036 / 10,000 transactions for tables (that includes read, write, delete). Inserting 551 million records as individual transactions would cost $19.84. But we could do better. 
More on that later.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ca href\u003d\"https://1.bp.blogspot.com/-mpJIamNILtI/XGOUQedXgTI/AAAAAAAAUho/j_3d-FhPjicsJS5MeW1EOV7JaRrBVAcjQCLcBGAs/s1600/azuretablepricing.JPG\" imageanchor\u003d\"1\"\u003e\u003cimg border\u003d\"0\" data-original-height\u003d\"314\" data-original-width\u003d\"1090\" height\u003d\"184\" src\u003d\"https://1.bp.blogspot.com/-mpJIamNILtI/XGOUQedXgTI/AAAAAAAAUho/j_3d-FhPjicsJS5MeW1EOV7JaRrBVAcjQCLcBGAs/s640/azuretablepricing.JPG\" width\u003d\"640\" /\u003e\u003c/a\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\nAlthough the application and API used against the Azure table would be made public, this pet project is more of an educational project for myself, so I don't expect large traffic (way fewer than 10,000 transactions). The operating transaction cost would be minimal.\u003cbr /\u003e\n\u003cbr /\u003e\nBack to the text file. It contains 551 million records. To increase the insert performance into Azure table storage, here are the implemented strategies:\u003cbr /\u003e\n\u003cbr /\u003e\n- Provisioned an Azure VM in the same region. Network latency and throughput are much better compared with a personal computer.\u003cbr /\u003e\n- Batched 100 records (the max limit) in a transaction whenever possible.\u003cbr /\u003e\n- Built some logic to insert records in parallel. 24 consoles were used to process different records simultaneously.\u003cbr /\u003e\n- Turned Nagle off to increase throughput for table inserts.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003eAzure VM\u003c/b\u003e\u003cbr /\u003e\nThe process is mostly compute intensive. Fsv2, Fs and F are the compute-optimized Azure VM types. 
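The pricing figures quoted in this post can be sanity-checked with simple arithmetic; below is a minimal Python sketch using the per-GB and per-10,000-transaction prices as quoted above (prices as of the time of writing):

```python
# Sanity-check of the Azure table storage cost figures quoted in the post.
# Assumed prices (from the post): $0.07 per GB-month (LRS) and
# $0.00036 per 10,000 table transactions (read/write/delete).
STORAGE_PRICE_PER_GB_MONTH = 0.07
PRICE_PER_10K_TRANSACTIONS = 0.00036

records = 551_000_000
dataset_gb = 50

# Monthly storage cost for the ~50 GB dataset.
storage_monthly = dataset_gb * STORAGE_PRICE_PER_GB_MONTH
# Naive insert cost: one transaction per record.
unbatched = records / 10_000 * PRICE_PER_10K_TRANSACTIONS
# Best case: one transaction per 100-record batch.
batched = (records / 100) / 10_000 * PRICE_PER_10K_TRANSACTIONS

print(f"storage/month: ${storage_monthly:.2f}")  # $3.50
print(f"unbatched insert: ${unbatched:.2f}")     # $19.84
print(f"batched insert: ${batched:.4f}")         # $0.1984
```

The 100x reduction from batching is exactly why the batch limit matters so much for bulk loads.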
In this case, an Fsv2 (2 vCPU, 4 GB RAM, 4 data disks, 4,000 max IOPS, $75.89 / month) with a standard HDD disk was selected.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ca href\u003d\"https://2.bp.blogspot.com/-ZzutGnGlK_E/XGOUrzzPSDI/AAAAAAAAUhw/a5AnoIY8s5ch-ziQy0oC5aKqe4HCCjg7wCLcBGAs/s1600/vmsize.JPG\" imageanchor\u003d\"1\"\u003e\u003cimg border\u003d\"0\" data-original-height\u003d\"364\" data-original-width\u003d\"787\" height\u003d\"296\" src\u003d\"https://2.bp.blogspot.com/-ZzutGnGlK_E/XGOUrzzPSDI/AAAAAAAAUhw/a5AnoIY8s5ch-ziQy0oC5aKqe4HCCjg7wCLcBGAs/s640/vmsize.JPG\" width\u003d\"640\" /\u003e\u003c/a\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003eMultiple records in batch\u003c/b\u003e\u003cbr /\u003e\nAzure table storage has a limit of 100 records per batch operation. All records in the batch must have the same partition key. Batching not only increases insert performance, it also saves cost. One batch operation is counted as a single transaction. If all batches held 100 records, inserting 551 million records would cost only $0.1984! In this case, due to the parallel logic put in place, on average about 5 out of 6 batches contain 100 records.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003eParallel Logic\u003c/b\u003e\u003cbr /\u003e\nA single process would be boring and inefficient. The process needs to read the text file line by line and insert into Azure table storage. There doesn't seem to be any native way to parallelize reading the file and performing the inserts, especially considering the file is around 25 GB in size and the VM has only 4 GB of memory. I came up with a scheme where multiple PowerShell processes each work through their own unique section of the file line by line simultaneously, with minimal memory required. I picked 24, a somewhat arbitrary number based on previous experience. It keeps the VM's 2 vCPUs at around 90% utilization the entire time. 
Good.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ca href\u003d\"https://3.bp.blogspot.com/-lDxeFpiVAIM/XGOU3biH2RI/AAAAAAAAUh0/4fknlw3I8UAdFQmuMK8iul8zQngSNcu9gCLcBGAs/s1600/parallel.JPG\" imageanchor\u003d\"1\"\u003e\u003cimg border\u003d\"0\" data-original-height\u003d\"899\" data-original-width\u003d\"1595\" height\u003d\"361\" src\u003d\"https://3.bp.blogspot.com/-lDxeFpiVAIM/XGOU3biH2RI/AAAAAAAAUh0/4fknlw3I8UAdFQmuMK8iul8zQngSNcu9gCLcBGAs/s640/parallel.JPG\" width\u003d\"640\" /\u003e\u003c/a\u003e\n\u003ca href\u003d\"https://2.bp.blogspot.com/-AgNhPuD7CKY/XGOXH7V7zQI/AAAAAAAAUiM/Dz7BU-VOCroWBK3PfHi3quY5hBqQlCw3ACLcBGAs/s1600/cpu_resource.JPG\" imageanchor\u003d\"1\"\u003e\u003cimg border\u003d\"0\" data-original-height\u003d\"429\" data-original-width\u003d\"712\" height\u003d\"386\" src\u003d\"https://2.bp.blogspot.com/-AgNhPuD7CKY/XGOXH7V7zQI/AAAAAAAAUiM/Dz7BU-VOCroWBK3PfHi3quY5hBqQlCw3ACLcBGAs/s640/cpu_resource.JPG\" width\u003d\"640\" /\u003e\u003c/a\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003eTurned off Nagle\u003c/b\u003e\u003cbr /\u003e\nThe Microsoft storage team has an\u0026nbsp;\u003ca href\u003d\"https://blogs.msdn.microsoft.com/windowsazurestorage/2010/06/25/nagles-algorithm-is-not-friendly-towards-small-requests/\" target\u003d\"_blank\"\u003earticle\u003c/a\u003e on how the Nagle algorithm affects Azure table storage insert performance. I have personally seen quite a significant improvement for Azure queues by turning off Nagle: [System.Net.ServicePointManager]::UseNagleAlgorithm \u003d $false\u003cbr /\u003e\n\u003cbr /\u003e\nOnce these were set up and a new Azure table created, I ran the PowerShell processes to read the records and insert them into the Azure table. With 24 processes running simultaneously, it inserts around 12k batch transactions / minute. 
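The parallel scheme described above can be sketched in code. This is not the author's actual PowerShell; it is a minimal Python illustration (worker count, file path and line counts are hypothetical) of the two ideas: assign each worker its own contiguous line range, and stream that range in batches of up to 100 records so memory stays bounded:

```python
from itertools import islice

def plan_ranges(total_lines, workers):
    """Assign each worker a contiguous [start, end) line range."""
    size, extra = divmod(total_lines, workers)
    ranges, start = [], 0
    for w in range(workers):
        end = start + size + (1 if w < extra else 0)
        ranges.append((start, end))
        start = end
    return ranges

def stream_batches(path, start, end, batch_size=100):
    """Stream only lines [start, end) of the file, yielding batches of up
    to batch_size records (each batch would be one table transaction)."""
    with open(path) as f:
        batch = []
        for line in islice(f, start, end):
            batch.append(line.rstrip("\n"))
            if len(batch) == batch_size:
                yield batch
                batch = []
        if batch:  # trailing partial batch
            yield batch
```

Each of the 24 workers would call stream_batches with its own range; since the generator holds at most one batch in memory, a 25 GB file can be processed on a 4 GB VM.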
Estimating ~90 records per batch transaction, that is an average of 18,000 record inserts / second.\u003cbr /\u003e\n\u003cbr /\u003e\nThe total time for inserting 551 million records into the Azure table was about 8 and 1/2 hours.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003eSetup cost\u003c/b\u003e (including getting the server up and the scripts ready),\u003cbr /\u003e\nAzure VM ~ $1.46\u003cbr /\u003e\nAzure table (storage + transactions) - $3.23\u003cbr /\u003e\nBandwidth - $0.04\u003cbr /\u003e\nTotal initial setup cost: $4.73\u003cbr /\u003e\n\u003cbr /\u003e\nThe VM is only needed for this initial setup process and was decommissioned right after.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003eOperating cost \u003c/b\u003e(including all website components in addition to the Azure table)\u003cbr /\u003e\nAzure table (storage) ~ $3\u003cbr /\u003e\nAzure table transactions (it would take \u0026gt; 270k transactions to be charged $0.01) - $0\u003cbr /\u003e\nAzure Functions (as the backend API, $0.20 / million executions, 1 million executions free / month) - $0\u003cbr /\u003e\nAzure static website (the Angular app is small, \u0026lt; 850 KB) ~ $0\u003cbr /\u003e\nOutbound bandwidth (first 5 GB is free; between 5 GB - 10 TB, $0.087 / GB. The application returns an estimated ~100 bytes per request) ~ $0\u003cbr /\u003e\nTotal monthly operating cost: ~$3\u003cbr /\u003e\n\u003cbr /\u003e\nPotential improvements: I could simplify the parallel logic to further increase the ratio of 100-record batches, and also increase the degree of parallelism (eg. 
48 or more simultaneous processes) along with a VM that has more vCPUs to further reduce the duration."},"link":[{"rel":"replies","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/7460865691864655045/comments/default","title":"Post Comments"},{"rel":"replies","type":"text/html","href":"https://www.travisgan.com/2019/02/inserting-and-hosting-551-millions.html#comment-form","title":"0 Comments"},{"rel":"edit","type":"application/atom+xml","href":"https://draft.blogger.com/feeds/3846425018755283358/posts/default/7460865691864655045"},{"rel":"self","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/posts/default/7460865691864655045"},{"rel":"alternate","type":"text/html","href":"https://www.travisgan.com/2019/02/inserting-and-hosting-551-millions.html","title":"Inserting and Hosting 551 Millions Records Cheaply - Case Study"}],"author":[{"name":{"$t":"Travis"},"email":{"$t":"noreply@blogger.com"},"gd$image":{"rel":"http://schemas.google.com/g/2005#thumbnail","width":"35","height":"35","src":"//www.blogger.com/img/blogger_logo_round_35.png"}}],"media$thumbnail":{"xmlns$media":"http://search.yahoo.com/mrss/","url":"https://3.bp.blogspot.com/-fgwCWeUKcUc/XGOdoem-bxI/AAAAAAAAUiY/On4qkg0pKx4e_MFTQhjwcpuh5WbBK5PCACLcBGAs/s72-c/azureoffering.JPG","height":"72","width":"72"},"thr$total":{"$t":"0"}},{"id":{"$t":"tag:blogger.com,1999:blog-3846425018755283358.post-4672753049553847595"},"published":{"$t":"2018-06-24T23:00:00.000-05:00"},"updated":{"$t":"2019-02-19T09:50:30.140-06:00"},"category":[{"scheme":"http://www.blogger.com/atom/ns#","term":"Linux"}],"title":{"type":"text","$t":"Windows Guy Learning Linux - Basic Command for Linux"},"content":{"type":"html","$t":"\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003cimg border\u003d\"0\" data-original-height\u003d\"400\" data-original-width\u003d\"200\" height\u003d\"160\" imageanchor\u003d\"1\" more\u003d\"\" 
src\u003d\"https://4.bp.blogspot.com/-jOOXd472bn4/WzJL9qKCS9I/AAAAAAAAPcU/qpXpQfe5bVcKvvQjyJWtX17KntchyCoAgCLcBGAs/s320/ilovelinux.jpg\" style\u003d\"clear: left; float: left; margin-bottom: 1em; margin-right: 1em;\" width\u003d\"200\" /\u003e\u003c/div\u003e\nIf you have been working mostly with Microsoft technology stacks, switching from Windows and learning Linux can be an exciting yet daunting journey. So far my journey into Linux has been fun and I love it!\n\u003cbr /\u003e\n\u003cbr /\u003e\nLinux has a GUI, but server and application administration is performed mostly (almost entirely) from the command line. If you are on a Windows workstation, \u003ca href\u003d\"https://www.putty.org/\" target\u003d\"_blank\"\u003ePuTTY\u003c/a\u003e can be used to ssh into a Linux server.\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ca name\u003d'more'\u003e\u003c/a\u003eHere are some basic and commonly used commands. Hopefully the list is useful for you. Have fun learning!\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003e\u003cu\u003eTerminal\u003c/u\u003e\u003c/b\u003e\u003cbr /\u003e\n\u003cb\u003ectrl+c\u003c/b\u003e\u0026nbsp; stop the command.\u003cbr /\u003e\n\u003cb\u003ectrl+d\u003c/b\u003e\u0026nbsp; log out from the current terminal (just like the command \u003cb\u003eexit\u003c/b\u003e)\u003cbr /\u003e\n\u003cb\u003ectrl+u\u003c/b\u003e\u0026nbsp; erase the current line. It clears from the cursor to the beginning of the line\u003cbr /\u003e\n\u003cb\u003ectrl+y\u003c/b\u003e\u0026nbsp; restore the line text erased by ctrl+u\u003cbr /\u003e\n\u003cb\u003ectrl+l\u003c/b\u003e\u0026nbsp; clear the screen (or use the\u0026nbsp;\u003cb\u003eclear\u003c/b\u003e command). Just like cls in Windows\u003cbr /\u003e\n\u003cb\u003ectrl+z\u003c/b\u003e\u0026nbsp; send the process to the background. Useful when you have a long-running process and need to work in the terminal. 
type \u003cb\u003efg\u003c/b\u003e to bring it back\u003cbr /\u003e\n\u003cb\u003e\u003cbr /\u003e\u003c/b\u003e\nThese are less useful on a Windows keyboard:\u003cbr /\u003e\n\u003cb\u003ectrl+a\u0026nbsp; \u003c/b\u003emove the cursor to the beginning of the line. Or just use the keyboard Home key\u003cbr /\u003e\n\u003cb\u003ectrl+e\u003c/b\u003e\u0026nbsp; move the cursor to the end of the line. Or just use the keyboard End key\u003cbr /\u003e\n\u003cbr /\u003e\nTip: the \u003cb\u003etab\u003c/b\u003e key automatically completes the command or file name. Just like in cmd or PowerShell.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003e\u003cu\u003eHelp / Information\u003c/u\u003e\u003c/b\u003e\u003cbr /\u003e\n\u003cb\u003eman\u003c/b\u003e\u0026nbsp; user manual or help for a command. eg. \u003ci\u003eman ls \u003c/i\u003eshows details and usage of the ls command. To get out of a man page, type q\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003e\u003cu\u003eCommonly used commands\u003c/u\u003e\u003c/b\u003e\u003cbr /\u003e\n\u003cb\u003epwd\u0026nbsp; \u003c/b\u003eprint the current folder path\u003cbr /\u003e\n\u003cb\u003els\u003c/b\u003e\u0026nbsp; list directory contents\u003cbr /\u003e\n\u003cb\u003ell\u003c/b\u003e\u0026nbsp; is an alias of ls -l. It shows the long listing format\u003cbr /\u003e\n\u003cb\u003els -al\u003c/b\u003e\u0026nbsp; shows the long listing format including files starting with . (dot), which are treated as hidden files.\u003cbr /\u003e\n\u003cb\u003e\u003cbr /\u003e\u003c/b\u003e\nUse the ll command to list file / directory permissions. More permission info \u003ca href\u003d\"https://www.linux.com/learn/understanding-linux-file-permissions\" target\u003d\"_blank\"\u003ehere\u003c/a\u003e. 
For example,\u003cbr /\u003e\n\u003cbr /\u003e\ndrwxr-xr-x\u0026nbsp; (a directory)\u003cbr /\u003e\n-rw-r--r--\u0026nbsp; (a file)\u003cbr /\u003e\n\u003cbr /\u003e\nr as read permission\u003cbr /\u003e\nw as modify permission\u003cbr /\u003e\nx as permission to execute a file or view the contents of a directory\u003cbr /\u003e\n\u003cbr /\u003e\n1st character indicates a file or directory. d as directory, - as file\u003cbr /\u003e\nCharacters 2-4 indicate the permissions of the owner of the file/directory\u003cbr /\u003e\nCharacters 5-7 indicate the permissions of the group of the file/directory\u003cbr /\u003e\nCharacters 8-10 indicate the permissions of all other users\u003cbr /\u003e\n\u003cdiv\u003e\n\u003cbr /\u003e\u003c/div\u003e\n\u003cdiv\u003e\n\u003cb\u003echown \u003c/b\u003euser1.group1\u0026nbsp;\u003cb\u003e\u0026lt;file/folder\u0026gt;\u003c/b\u003e\u0026nbsp; change ownership of files/directories. User, group, other (all users). Use -R for a recursive folder permission change\u003cbr /\u003e\n\u003cb\u003echmod 644\u003c/b\u003e\u0026nbsp;\u0026lt;file/folder\u0026gt;\u0026nbsp; change file/folder permissions. eg. if this is a file, 6 in binary is 110 and 4 in binary is 100, so the file permission would be -rw-r--r--\u003c/div\u003e\n\u003cdiv\u003e\n\u003cb\u003esudo\u003c/b\u003e\u0026nbsp; execute a command as the superuser or another user\u003cbr /\u003e\n\u003cb\u003esu\u003c/b\u003e\u0026nbsp; change user\u003cbr /\u003e\n\u003cdiv\u003e\n\u003cbr /\u003e\u003c/div\u003e\n\u003cb\u003ecd\u003c/b\u003e\u0026nbsp;change the current directory. Just like Windows\u003c/div\u003e\n\u003cb\u003ecd -\u0026nbsp; \u003c/b\u003e(dash), cd into the previous folder (the folder before the previous cd command)\u003cbr /\u003e\n\u003cb\u003ecd ~\u0026nbsp;\u003c/b\u003e (tilde), ~ refers to the home directory. 
cd into the home directory\u003cbr /\u003e\n\u003cb\u003ecd /\u003c/b\u003e\u0026nbsp; (slash), / is the root directory\u003cbr /\u003e\n\u003cb\u003emkdir\u003c/b\u003e\u0026nbsp; make (create) a new directory\u003cbr /\u003e\n\u003cb\u003erm\u003c/b\u003e\u0026nbsp; remove a folder or file. eg. rm \u0026lt;file\u0026gt;, rm -r \u0026lt;folder\u0026gt; (-r removes a directory and its contents recursively)\u003cbr /\u003e\n\u003cb\u003ecp\u003c/b\u003e\u0026nbsp; copy a file/folder\u003cbr /\u003e\n\u003cb\u003escp\u003c/b\u003e\u0026nbsp; secure copy a file/folder, eg. copying files between hosts over ssh: scp file.txt username@destination_host:/folder/subfolder\u003cbr /\u003e\n\u003cb\u003etouch\u003c/b\u003e \u0026lt;file\u0026gt;\u0026nbsp; \u0026nbsp;create an empty file. Or use a text editor to create a new file with content\u003cbr /\u003e\n\u003cb\u003e\u0026gt;\u0026nbsp;\u003c/b\u003e output to a file. eg. ls \u0026gt; abc.txt\u003cbr /\u003e\n\u003cb\u003e\u0026gt;\u0026gt;\u0026nbsp;\u003c/b\u003e\u0026nbsp;append to a file. eg. ls \u0026gt;\u0026gt; abc.txt\u003cbr /\u003e\n\u003cb\u003efind \u003c/b\u003e\u0026lt;directory\u0026gt;\u003cb\u003e -name \u003c/b\u003e\u0026lt;file/folder\u0026gt;\u0026nbsp; search for a file/folder in the directory. Use -iname for a case-insensitive search\u003cbr /\u003e\n\u003cb\u003elocate\u003c/b\u003e\u0026nbsp;\u0026lt;file\u0026gt;\u0026nbsp; quickly search the file index database on the system (which may not be up to date)\u003cbr /\u003e\n\u003cb\u003efile\u003c/b\u003e\u0026nbsp;\u0026lt;file\u0026gt;\u0026nbsp; show the file type (eg. to examine whether a file is a text or archive file)\u003cbr /\u003e\n\u003cb\u003egrep\u003c/b\u003e\u0026nbsp; search for a matching pattern or regular expression. Very useful for pattern matching. Eg. 
using it in conjunction with ps, as in ps -ef | grep java, to find the java process\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003e\u003cu\u003eArchive/Compress\u003c/u\u003e\u003c/b\u003e\n\u003cb\u003etar\u003c/b\u003e\u0026nbsp; archive utility (to archive and/or compress). Eg.\u003cbr /\u003e\n\u003cb\u003etar -cvf\u003c/b\u003e\u0026nbsp; test.tar test1\u0026nbsp; (to archive the test1 folder into a tar file; -c to create an archive, -v for verbose and -f for file)\u003cbr /\u003e\n\u003cb\u003etar -xvf\u003c/b\u003e\u0026nbsp; test.tar (to extract all files from test.tar)\u003cbr /\u003e\n\u003cb\u003etar -cvzf\u003c/b\u003e\u0026nbsp; test.tar.gz test1 (to archive the test1 folder and compress it into a tar.gz file; -z to compress)\u003cbr /\u003e\n\u003cb\u003etar -xvzf\u003c/b\u003e test.tar.gz (to extract and uncompress test.tar.gz)\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003e\u003cu\u003eView / modify file content\u003c/u\u003e\u003c/b\u003e\u003cbr /\u003e\n\u003cb\u003ecat\u003c/b\u003e \u0026lt;file\u0026gt;\u0026nbsp; concatenate and show all contents\u003cbr /\u003e\n\u003cb\u003eless\u003c/b\u003e \u0026lt;file\u0026gt;\u0026nbsp; show content one page at a time. Scroll forward with f (or the Page Down key) and scroll backward with b (or the Page Up key).\u003cbr /\u003e\n\u003cb\u003eless -p\u003c/b\u003e \u0026lt;pattern\u0026gt;\u0026nbsp; \u0026lt;file\u0026gt;\u0026nbsp; find a pattern in the file\u003cbr /\u003e\n\u003cb\u003enano\u003c/b\u003e\u0026nbsp; simple text editor, kind of like Notepad. nano is pretty straightforward. For help, press ctrl+G. The ^ (caret) refers to the Ctrl key, M refers to the Alt key\u003cbr /\u003e\n\u003cb\u003evi\u003c/b\u003e\u0026nbsp; more advanced (can be unintuitive for beginners) text editor. 
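Returning to the permission notation covered earlier: the octal argument to chmod maps mechanically onto the rwx string that ll displays, with bit values r=4, w=2, x=1 per digit. A small Python sketch of that mapping (the helper name is made up for illustration):

```python
def mode_string(octal, is_dir=False):
    """Render a chmod octal triplet (eg. '644') as an ls-style permission
    string. Each digit covers owner, group, other; bits are r=4, w=2, x=1."""
    out = "d" if is_dir else "-"   # 1st character: directory or file
    for digit in octal:
        n = int(digit)
        out += "r" if n & 4 else "-"
        out += "w" if n & 2 else "-"
        out += "x" if n & 1 else "-"
    return out

print(mode_string("644"))               # -rw-r--r--
print(mode_string("755", is_dir=True))  # drwxr-xr-x
```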
vi has two modes of operation: command mode and insert mode\u003cbr /\u003e\n\u003cbr /\u003e\npress the Esc key to put vi in command mode\u003cbr /\u003e\npress the i key to put vi in insert mode\u003cbr /\u003e\n\u003cbr /\u003e\nvi command mode,\u003cbr /\u003e\n\u003cb\u003edd\u003c/b\u003e\u0026nbsp; delete line\u003cbr /\u003e\n\u003cb\u003eyy\u003c/b\u003e\u0026nbsp; copy (yank) line\u003cbr /\u003e\n\u003cb\u003ep\u003c/b\u003e\u0026nbsp; paste\u003cbr /\u003e\n\u003cb\u003eu\u003c/b\u003e\u0026nbsp; undo\u003cbr /\u003e\n\u003cb\u003ectrl+r\u003c/b\u003e\u0026nbsp; redo\u003cbr /\u003e\n\u003cb\u003e1G\u003c/b\u003e\u0026nbsp; beginning of the first line (or :0)\u003cbr /\u003e\n\u003cb\u003eG$\u003c/b\u003e\u0026nbsp; end of the last line\u003cbr /\u003e\n\u003cb\u003e/searchword\u003c/b\u003e\u0026nbsp; find a word (n for the next match, shift+n for the previous)\u003cbr /\u003e\n\u003cb\u003e:%s/\u0026lt;searchword\u0026gt;/\u0026lt;replaceword\u0026gt;/g\u003c/b\u003e\u0026nbsp; find searchword and replace it with replaceword\u003cbr /\u003e\n\u003cb\u003e:q\u003c/b\u003e\u0026nbsp; exit\u003cbr /\u003e\n\u003cdiv\u003e\n\u003cb\u003e:q!\u003c/b\u003e\u0026nbsp; exit, discarding any changes\u003c/div\u003e\n\u003cdiv\u003e\n\u003cb\u003e:wq\u003c/b\u003e\u0026nbsp; save and exit\u003c/div\u003e\n\u003cdiv\u003e\n\u003cbr /\u003e\u003c/div\u003e\n\u003cb\u003e\u003cu\u003eSystem (storage, process, services, etc)\u003c/u\u003e\u003c/b\u003e\u003cbr /\u003e\n\u003cb\u003euname -a\u003c/b\u003e\u0026nbsp; show all system information, eg. the Linux version\u003cbr /\u003e\n\u003cb\u003edf -h\u003c/b\u003e\u0026nbsp; display file system disk space usage. -h for human-readable sizes (eg. MB, GB instead of bytes)\u003cbr /\u003e\n\u003cb\u003edu -sh\u003c/b\u003e\u0026nbsp;\u0026lt;folder\u0026gt;\u0026nbsp; display the disk usage of a folder. 
-s for summary and -h for human-readable sizes\u003cbr /\u003e\n\u003cb\u003eps -ef\u003c/b\u003e\u0026nbsp; display a snapshot of current processes. -ef shows all processes using standard syntax\u003cbr /\u003e\n\u003cb\u003etop\u003c/b\u003e\u0026nbsp; display Linux processes. Like Task Manager in Windows. Examples,\u003cbr /\u003e\nh for help\u003cbr /\u003e\nz to toggle highlighting\u003cbr /\u003e\nc to show the absolute path\u003cbr /\u003e\nk to kill a process by pid\u003cbr /\u003e\nu to filter by user\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003esystemctl -at service \u003c/b\u003eshow all services (systemctl is used to manage services)\u003cbr /\u003e\n\u003cb\u003esystemctl -t service --state\u003dactive\u003c/b\u003e show only active services\u003cbr /\u003e\n\u003cb\u003esystemctl status \u0026lt;service\u0026gt;\u003c/b\u003e\u0026nbsp; show the service status (eg. enabled, active)\u003cbr /\u003e\n\u003cb\u003esudo systemctl start \u0026lt;service\u0026gt;\u003c/b\u003e\u0026nbsp; start a specific service\u003cbr /\u003e\n\u003cb\u003esudo systemctl stop \u0026lt;service\u0026gt;\u0026nbsp; \u003c/b\u003estop a specific service\u003cbr /\u003e\n\u003cbr /\u003e\nMore administrative commands can be found on this \u003ca href\u003d\"https://access.redhat.com/articles/1189123\" target\u003d\"_blank\"\u003epage\u003c/a\u003e."},"link":[{"rel":"replies","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/4672753049553847595/comments/default","title":"Post Comments"},{"rel":"replies","type":"text/html","href":"https://www.travisgan.com/2018/06/windows-guy-learning-linux-basic.html#comment-form","title":"0 
Comments"},{"rel":"edit","type":"application/atom+xml","href":"https://draft.blogger.com/feeds/3846425018755283358/posts/default/4672753049553847595"},{"rel":"self","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/posts/default/4672753049553847595"},{"rel":"alternate","type":"text/html","href":"https://www.travisgan.com/2018/06/windows-guy-learning-linux-basic.html","title":"Windows Guy Learning Linux - Basic Command for Linux"}],"author":[{"name":{"$t":"Travis"},"email":{"$t":"noreply@blogger.com"},"gd$image":{"rel":"http://schemas.google.com/g/2005#thumbnail","width":"35","height":"35","src":"//www.blogger.com/img/blogger_logo_round_35.png"}}],"media$thumbnail":{"xmlns$media":"http://search.yahoo.com/mrss/","url":"https://4.bp.blogspot.com/-jOOXd472bn4/WzJL9qKCS9I/AAAAAAAAPcU/qpXpQfe5bVcKvvQjyJWtX17KntchyCoAgCLcBGAs/s72-c/ilovelinux.jpg","height":"72","width":"72"},"thr$total":{"$t":"0"}},{"id":{"$t":"tag:blogger.com,1999:blog-3846425018755283358.post-4426617738316451871"},"published":{"$t":"2015-12-31T14:33:00.001-06:00"},"updated":{"$t":"2015-12-31T15:02:09.129-06:00"},"category":[{"scheme":"http://www.blogger.com/atom/ns#","term":"Self Learning"}],"title":{"type":"text","$t":"Microsoft Recertification"},"content":{"type":"html","$t":"\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://3.bp.blogspot.com/-iQkcOG_et8E/VoKbKHMSx7I/AAAAAAAAEDw/lGbWfACKEbM/s1600/firework.jpg\" imageanchor\u003d\"1\" style\u003d\"clear: left; float: left; margin-bottom: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"132\" src\u003d\"http://3.bp.blogspot.com/-iQkcOG_et8E/VoKbKHMSx7I/AAAAAAAAEDw/lGbWfACKEbM/s200/firework.jpg\" width\u003d\"200\" /\u003e\u003c/a\u003e\u003c/div\u003e\nYear 2016 is just around the corner. People are in a holiday mood, and in fact a lot of folks are on vacation at this time of year. 
Projects usually take this slow period into account, avoiding heavy personnel involvement and introducing changes only where necessary. This slow time of the year can be a great opportunity to expand knowledge and skills, and possibly look into certification / recertification.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ca name\u003d'more'\u003e\u003c/a\u003eAround 2012, Microsoft introduced a recertification requirement for the Microsoft Certified Solution Expert (MCSE) and Microsoft Certified Solution Developer (MCSD) certifications. Unlike the 'retired' certifications (eg. MCITP SQL Server), which do not expire (but are moved to legacy status), the new MCSE certification expires in 3 years (MCSD in 2 years) and requires recertification to maintain active status.\u003cbr /\u003e\n\u003cbr /\u003e\nAny MCSE recertification can be completed by passing the respective recertification exam, for example Exam 70-469 for MCSE Data Platform. Besides taking the recertification exam, there is another option Microsoft \u003ca href\u003d\"https://borntolearn.mslearn.net/b/weblog/archive/2015/04/02/introducing-recertification-through-microsoft-virtual-academy\" target\u003d\"_blank\"\u003eannounced\u003c/a\u003e early this year (2015):\u003ca href\u003d\"https://www.microsoft.com/en-us/learning/recertification-virtual-academy.aspx\" target\u003d\"_blank\"\u003e recertification through Microsoft Virtual Academy\u003c/a\u003e (MVA). However, this option is currently (as of the end of 2015) available for only certain MCSE tracks. These include:\u003cbr /\u003e\n\u003cbr /\u003e\nMCSE: Data Platform\u003cbr /\u003e\nMCSE: Business Intelligence\u003cbr /\u003e\nMCSE: Communication\u003cbr /\u003e\nMCSE: Messaging\u003cbr /\u003e\nMCSE: SharePoint\u003cbr /\u003e\n\u003cbr /\u003e\nRecertification through MVA requires passing all the module assessments for the course. 
Below is an example showing the modules required for MCSE Data Platform.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-l8vzJFSx5Dk/VoGLH8TK0GI/AAAAAAAAEDU/GejQohUsRKk/s1600/recertification.png\" imageanchor\u003d\"1\"\u003e\u003cimg border\u003d\"0\" height\u003d\"404\" src\u003d\"http://1.bp.blogspot.com/-l8vzJFSx5Dk/VoGLH8TK0GI/AAAAAAAAEDU/GejQohUsRKk/s640/recertification.png\" width\u003d\"640\" /\u003e\u003c/a\u003e\u003cbr /\u003e\n\u003cbr /\u003e\nSo which option is right for you?\u003cbr /\u003e\n\u003cbr /\u003e\nRecertification through the recertification exam usually requires passing only one exam. \u0026nbsp;The exam costs $150 USD and usually takes 3-4 hours. Certain exams are available at certified test centers throughout the world, and certain exams are also offered with online proctored delivery. To prepare for the exam, exam takers can take advantage of practice tests, prep videos or even instructor-led training with Microsoft learning partners. Also, Microsoft offers discounts or Second Shot (free retake) on these exams from time to time. There is currently a promotion for certain exams scheduled between July 12, 2015 and January 12, 2016.\u003cbr /\u003e\n\u003cbr /\u003e\nFor recertification through MVA, since it is not yet available for all MCSE (or MCSD) tracks, this may be an option only for certain folks, at least for now. As seen in the example above, there are multiple modules. Each module consists of multiple parts with videos, presentations and assessments, and all assessments must be completed. MVA courses are available online and you can complete each module at your own pace. 
There isn't any monetary cost to sign up for MVA, view the module content or take the assessments. In addition to the modules required for recertification, there are many modules for anyone to learn from and expand their knowledge. Keep in mind that upon completion of the MVA course, you need to submit your MVA user name and Microsoft Certification ID (MC ID) to certquest@microsoft.com for them to verify and update your transcript. There is also an MVA community that provides support and help with your questions.\u003cbr /\u003e\n\u003cbr /\u003e\nSo, pick the option that seems more suitable for you. I chose the MVA route this time around as I wanted to explore this new option and the modules offered. My experience has been that while some modules emphasize targeted expertise and new features, other modules broaden into different areas and bring awareness of new products (eg. machine learning, the Azure platform). And yes, I am recertified as MCSE Data Platform and MCSE Business Intelligence.\u003cbr /\u003e\n\u003cbr /\u003e\nHappy New Year!"},"link":[{"rel":"replies","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/4426617738316451871/comments/default","title":"Post Comments"},{"rel":"replies","type":"text/html","href":"https://www.travisgan.com/2015/12/microsoft-recertification.html#comment-form","title":"0 Comments"},{"rel":"edit","type":"application/atom+xml","href":"https://draft.blogger.com/feeds/3846425018755283358/posts/default/4426617738316451871"},{"rel":"self","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/posts/default/4426617738316451871"},{"rel":"alternate","type":"text/html","href":"https://www.travisgan.com/2015/12/microsoft-recertification.html","title":"Microsoft 
Recertification"}],"author":[{"name":{"$t":"Travis"},"email":{"$t":"noreply@blogger.com"},"gd$image":{"rel":"http://schemas.google.com/g/2005#thumbnail","width":"35","height":"35","src":"//www.blogger.com/img/blogger_logo_round_35.png"}}],"media$thumbnail":{"xmlns$media":"http://search.yahoo.com/mrss/","url":"http://3.bp.blogspot.com/-iQkcOG_et8E/VoKbKHMSx7I/AAAAAAAAEDw/lGbWfACKEbM/s72-c/firework.jpg","height":"72","width":"72"},"thr$total":{"$t":"0"}},{"id":{"$t":"tag:blogger.com,1999:blog-3846425018755283358.post-6655396350232269497"},"published":{"$t":"2015-06-29T14:16:00.002-05:00"},"updated":{"$t":"2015-06-29T21:05:52.470-05:00"},"category":[{"scheme":"http://www.blogger.com/atom/ns#","term":"Replication"},{"scheme":"http://www.blogger.com/atom/ns#","term":"Troubleshooting"}],"title":{"type":"text","$t":"SQL Replication - Subscriber On Different Domain"},"content":{"type":"html","$t":"Setting up SQL Server replication with the publisher / distributor on one domain and the subscriber on another domain can be tricky. It can be even more interesting when these servers have the same host name. This blog post discusses the issues encountered and the steps to address them.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ca name\u003d'more'\u003e\u003c/a\u003e\u003cbr /\u003e\n\u003cb\u003ePreparation\u003c/b\u003e\n\u003cbr /\u003e\nIn this example, two servers with SQL Server installed are set up in Microsoft Azure. Both use a different cloud service (different domain) but are within the same virtual network. In Microsoft Azure, when creating a virtual machine in a cloud service, Azure assigns the VM to a subdomain of cloudapp.net. 
Here is the server information.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://2.bp.blogspot.com/-LGZ-NrviG70/VZGe3ni3q9I/AAAAAAAADxs/q5prqZ1ZxII/s1600/SqlTopology.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"197\" src\u003d\"http://2.bp.blogspot.com/-LGZ-NrviG70/VZGe3ni3q9I/AAAAAAAADxs/q5prqZ1ZxII/s400/SqlTopology.png\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nThe first VM (Server A)\u003cbr /\u003e\nHost Name: SQL2014\u003cbr /\u003e\nDomain:\u0026nbsp;CloudLan.d10.internal.cloudapp.net\u003cbr /\u003e\nIP address: 10.0.0.5\u003cbr /\u003e\n\u003cbr /\u003e\nThe second VM (Server B)\u003cbr /\u003e\nHost Name: SQL2014\u003cbr /\u003e\nDomain: MoonLan.d10.internal.cloudapp.net\u003cbr /\u003e\nIP address: 10.0.0.4\u003cbr /\u003e\n\u003cbr /\u003e\nThey are both hosted on the same virtual network.\u003cbr /\u003e\n\u003cbr /\u003e\nA database, TEST, has previously been set up with a publication (PUB_C) and configured for transactional replication. Refer to this \u003ca href\u003d\"http://www.travisgan.com/2015/06/sql-server-replication-case-study.html\" target\u003d\"_blank\"\u003epost\u003c/a\u003e\u0026nbsp;for the setup.\u003cbr /\u003e\n\u003cdiv\u003e\n\u003cbr /\u003e\u003c/div\u003e\n\u003cdiv\u003e\n\u003cb\u003eCommunication Verification\u003c/b\u003e\u003c/div\u003e\nThe FQDN (Fully Qualified Domain Name) must be used for communication between these two servers as they are on different domains.\u003cbr /\u003e\n\u003cbr /\u003e\nPinging Server B (with its FQDN) from Server A succeeds, verifying that the two servers are able to communicate with each other. 
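\u003cbr /\u003e\n\u003cbr /\u003e\nBoth checks (ping and a SQL connection test) can be run from a command prompt on Server A. This is a minimal sketch, assuming Windows authentication (-E) and the sqlcmd utility:\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:ps\"\u003eping SQL2014.MoonLan.d10.internal.cloudapp.net\nsqlcmd -S SQL2014.MoonLan.d10.internal.cloudapp.net -E -Q \"SELECT @@SERVERNAME;\"\n\u003c/pre\u003e\n\u003cbr /\u003e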
Connecting from Server A to Server B's default SQL Server instance also succeeds using the FQDN of Server B.\u003cbr /\u003e\n\u003cbr /\u003e\nPerforming the same test in the other direction, connecting from Server B to Server A's default SQL Server instance also succeeds. So far so good. Each server is able to communicate with the other as long as the server FQDN is used.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003eReplication - Adding subscription\u003c/b\u003e\u003cbr /\u003e\nOn SQL Server A, attempt to add SQL Server B as a subscriber to this publication\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://2.bp.blogspot.com/-wUag5fTOB3Y/VZFe6R0GrPI/AAAAAAAADwI/fCovMyfqkBc/s1600/SqlSubcription1.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"400\" src\u003d\"http://2.bp.blogspot.com/-wUag5fTOB3Y/VZFe6R0GrPI/AAAAAAAADwI/fCovMyfqkBc/s400/SqlSubcription1.png\" width\u003d\"372\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://4.bp.blogspot.com/-KiFOVL2CFzA/VZFe6UnZnhI/AAAAAAAADvk/29TGpYCt3o0/s1600/SqlSubcription2.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"257\" src\u003d\"http://4.bp.blogspot.com/-KiFOVL2CFzA/VZFe6UnZnhI/AAAAAAAADvk/29TGpYCt3o0/s400/SqlSubcription2.png\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nIn this example, the Push subscription method is used\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-I7KviYSfN2k/VZFe6YGgfzI/AAAAAAAADvg/jkMilNZMX_c/s1600/SqlSubcription3.png\" imageanchor\u003d\"1\" 
style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"155\" src\u003d\"http://1.bp.blogspot.com/-I7KviYSfN2k/VZFe6YGgfzI/AAAAAAAADvg/jkMilNZMX_c/s400/SqlSubcription3.png\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://3.bp.blogspot.com/-PAOEUsF_LjI/VZFe7MOXLaI/AAAAAAAADvo/HVzPkTGTTyI/s1600/SqlSubcription4.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"325\" src\u003d\"http://3.bp.blogspot.com/-PAOEUsF_LjI/VZFe7MOXLaI/AAAAAAAADvo/HVzPkTGTTyI/s400/SqlSubcription4.png\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nConnecting the Server B default SQL Server instance with the FQDN since it is on different domain\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://4.bp.blogspot.com/-ywBA5NwGwbA/VZFe8eauFRI/AAAAAAAADvs/54gKMp8kps8/s1600/SqlSubcription5.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"173\" src\u003d\"http://4.bp.blogspot.com/-ywBA5NwGwbA/VZFe8eauFRI/AAAAAAAADvs/54gKMp8kps8/s400/SqlSubcription5.png\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nError!\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://3.bp.blogspot.com/-C16A2ngRlVA/VZFe82xh3yI/AAAAAAAADvw/28O3nR6SCAM/s1600/SqlSubcriptionError.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"111\" src\u003d\"http://3.bp.blogspot.com/-C16A2ngRlVA/VZFe82xh3yI/AAAAAAAADvw/28O3nR6SCAM/s400/SqlSubcriptionError.png\" 
width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cspan style\u003d\"color: red;\"\u003eCannot connect to SQL2014.MoonLan.d9.internal.cloudapp.net.\u003c/span\u003e\u003cbr /\u003e\n\u003cspan style\u003d\"color: red;\"\u003e\u003cbr /\u003e\u003c/span\u003e\n\u003cspan style\u003d\"color: red;\"\u003e------------------------------\u003c/span\u003e\u003cbr /\u003e\n\u003cspan style\u003d\"color: red;\"\u003eADDITIONAL INFORMATION:\u003c/span\u003e\u003cbr /\u003e\n\u003cspan style\u003d\"color: red;\"\u003e\u003cbr /\u003e\u003c/span\u003e\n\u003cspan style\u003d\"color: red;\"\u003eSQL Server replication requires the actual server name to make a connection to the server. Specify the actual server name, 'SQL2014'. (Replication.Utilities)\u003c/span\u003e\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\nIf we read the error message, it states that the actual server name 'SQL2014' is required. It appears that the subscriber server name (server B) used should be the host name of the server, rather than the FQDN name. Or is it?\u003cbr /\u003e\n\u003cbr /\u003e\nLet's investigate. 
On Server B, run these commands,\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eSELECT @@SERVERNAME;\nGO\nSELECT *\nFROM sys.servers;\nGO\n\u003c/pre\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003c/div\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://3.bp.blogspot.com/-6WZTDecAanM/VZFv90gszcI/AAAAAAAADwo/VBbvePJU5Hs/s1600/sqlsysservers.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"106\" src\u003d\"http://3.bp.blogspot.com/-6WZTDecAanM/VZFv90gszcI/AAAAAAAADwo/VBbvePJU5Hs/s320/sqlsysservers.png\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nThe sys.servers system table stores the local SQL Server instance name (when server_id \u003d 0) and linked servers (if any, for server_id \u0026gt; 0). The current entry for the local instance is SQL2014.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003eA word of caution for the steps below. Usually you do not need to change sys.servers unless its entry differs from the server host name. You may just need to add an alias on SQL Server A (go to the 'Setting up an Alias for SQL Server B' section below). More information is in the conclusion section below.\u003c/b\u003e\u003cbr /\u003e\n\u003cbr /\u003e\nTo verify whether the replication subscriber error refers to the host name or to this entry in sys.servers, let's change this record to SQL2014B (with the B),\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eEXEC sp_dropserver 'SQL2014'\nGO\nEXEC sp_addserver 'SQL2014B', local;\nGO\n\u003c/pre\u003e\n\u003cbr /\u003e\nNotice that @@SERVERNAME does not reflect the change yet. Let's restart the SQL Server service on Server B. 
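\u003cbr /\u003e\n\u003cbr /\u003e\nThe restart can be done from an elevated PowerShell prompt. A minimal sketch, assuming the default instance service name MSSQLSERVER (-Force also stops dependent services such as SQL Server Agent):\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:ps\"\u003eRestart-Service -Name MSSQLSERVER -Force\n\u003c/pre\u003e\n\u003cbr /\u003e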
Now both\u0026nbsp;@@SERVERNAME and sys.servers return SQL2014B. A reminder: the server name (host name) has not been changed. It is still SQL2014.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:ps\"\u003e$env:COMPUTERNAME\n\u003c/pre\u003e\n\u003cbr /\u003e\nThe result returned is SQL2014\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003eSetting up an Alias for SQL Server B\u003c/b\u003e\u003cbr /\u003e\nBack to Server A. We need to connect to the SQL Server instance (SQL2014B) on Server B (SQL2014). One way to accomplish this is to create a SQL Server alias for that instance (SQL2014B) on Server A. A SQL Server alias provides an alternate name to be used to connect to the target SQL Server. Adding a SQL Server alias can be done with SQL Native Client in the SQL Server Configuration Manager, in this case on Server A.\u003cbr /\u003e\n\u003cbr /\u003e\nNote that SSMS runs as a 32-bit application, while SQL Server in this case is the 64-bit version. Since the replication agent processes are invoked from SQL Server Agent, the replication processes run in 64-bit. 
So, in this case we will set up the alias in both the 32-bit and 64-bit SQL Native Client configurations.\u003cbr /\u003e\n\u003cbr /\u003e\nAlias: SQL2014B \u0026nbsp; \u0026nbsp; (Same as the record in sys.servers on SQL Server B)\u003cbr /\u003e\nPort: 1433 \u0026nbsp; (SQL Server default port since it's the default instance)\u003cbr /\u003e\nProtocol: TCP/IP\u003cbr /\u003e\nServer: SQL2014.MoonLan.d10.internal.cloudapp.net \u0026nbsp; \u0026nbsp; (This is Server B's FQDN)\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://2.bp.blogspot.com/-CpqA7XD_x0E/VZFvwGVGksI/AAAAAAAADwg/AFvUM6zItj4/s1600/SqlAlias1.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://2.bp.blogspot.com/-CpqA7XD_x0E/VZFvwGVGksI/AAAAAAAADwg/AFvUM6zItj4/s1600/SqlAlias1.png\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003c/div\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://3.bp.blogspot.com/-WqP05m4BzgI/VZGabU9oPMI/AAAAAAAADxg/Gddz7-teuI0/s1600/SqlAlias2.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"137\" src\u003d\"http://3.bp.blogspot.com/-WqP05m4BzgI/VZGabU9oPMI/AAAAAAAADxg/Gddz7-teuI0/s320/SqlAlias2.png\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nNow back to the replication publication subscriber addition screen. 
Enter SQL2014B.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-nCpsfwTTKTo/VZFxi5duP-I/AAAAAAAADxA/IdW5ZsYfKMU/s1600/SqlSubcription5-1.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"172\" src\u003d\"http://1.bp.blogspot.com/-nCpsfwTTKTo/VZFxi5duP-I/AAAAAAAADxA/IdW5ZsYfKMU/s400/SqlSubcription5-1.png\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nNow it works.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003c/div\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://2.bp.blogspot.com/-CNjGTMxMB5M/VZFxP0RLQXI/AAAAAAAADww/XvSPB9SNPOQ/s1600/SqlSubcription6.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"157\" src\u003d\"http://2.bp.blogspot.com/-CNjGTMxMB5M/VZFxP0RLQXI/AAAAAAAADww/XvSPB9SNPOQ/s400/SqlSubcription6.png\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cb\u003eObservation\u003c/b\u003e\u003cbr /\u003e\nFrom what we just observed, the server name entered during subscriber addition needs to be the same as the local server entry in sys.servers on the subscriber. It has nothing to do with the host name of the subscriber server. The local server entry may differ from the host name due to host name (computer name) changes after SQL Server installation, or, as in this case, because it was changed intentionally.\u003cbr /\u003e\n\u003cbr /\u003e\nIf you didn't set up the alias in the 32-bit SQL Native Client configuration, you may continue to encounter the error during setup. 
Conversely, if you didn't set up the alias in the 64-bit SQL Native Client configuration on a 64-bit SQL Server, you may be able to complete adding the subscriber, but replication will encounter errors during synchronization to the subscriber.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://4.bp.blogspot.com/-hWqZ11B7sSM/VZF-ytZPs_I/AAAAAAAADxQ/UBjXwxzagmM/s1600/SqlSyncError.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"150\" src\u003d\"http://4.bp.blogspot.com/-hWqZ11B7sSM/VZF-ytZPs_I/AAAAAAAADxQ/UBjXwxzagmM/s320/SqlSyncError.png\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cb\u003eConclusion\u003c/b\u003e\u003cbr /\u003e\nTo regroup, the example above addresses two issues.\u003cbr /\u003e\n\u003cbr /\u003e\n- First, during subscriber addition, the server name entered needs to be the same as the local server entry in sys.servers on the subscriber.\u003cbr /\u003e\n- Second, the host name is the same on both servers, SQL2014, but with different FQDNs (this issue is not common)\u003cbr /\u003e\n\u003cbr /\u003e\nTo address the first issue, all we have to do is create an alias (matching the entry in sys.servers, usually the same as the server host name) for SQL Server B on Server A (the connecting side). Setting up an alias redirects the SQL communication for the alias to the actual SQL Server instance (via its FQDN).\u003cbr /\u003e\n\u003cbr /\u003e\nHowever, addressing the second issue in this example is a little tricky. We could change the record in sys.servers on SQL Server B to something different (as we saw, SQL2014B) without changing the host name of the server. However, having a different name for the SQL Server default instance and the server host name is usually not preferable. 
It could cause a lot of confusion and potentially break code and settings configured previously. This approach may work when changing the host name is not an option. With that said, always try to keep the default SQL Server instance name the same as the server host name.\u003cbr /\u003e\n\u003cbr /\u003e\nNote. Changing the host name (computer name) of the server does not change the entry in sys.servers. You will need to perform the sp_dropserver and sp_addserver steps as shown above to change the SQL Server instance name in sys.servers to match the host name.\u003cbr /\u003e\n\u003cbr /\u003e\nThis example sets up the push subscription from SQL Server A. As a result, the alias for SQL Server B is set up on SQL Server A. If you are setting up a pull subscription from SQL Server B, the alias for SQL Server A will need to be set up on SQL Server B. Just remember that the alias (for the receiving side) is set up on the connecting side (the side initializing the connection)."},"link":[{"rel":"replies","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/6655396350232269497/comments/default","title":"Post Comments"},{"rel":"replies","type":"text/html","href":"https://www.travisgan.com/2015/06/sql-replication-subscriber-on-different.html#comment-form","title":"1 Comments"},{"rel":"edit","type":"application/atom+xml","href":"https://draft.blogger.com/feeds/3846425018755283358/posts/default/6655396350232269497"},{"rel":"self","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/posts/default/6655396350232269497"},{"rel":"alternate","type":"text/html","href":"https://www.travisgan.com/2015/06/sql-replication-subscriber-on-different.html","title":"SQL Replication - Subscriber On Different 
Domain"}],"author":[{"name":{"$t":"Travis"},"email":{"$t":"noreply@blogger.com"},"gd$image":{"rel":"http://schemas.google.com/g/2005#thumbnail","width":"35","height":"35","src":"//www.blogger.com/img/blogger_logo_round_35.png"}}],"media$thumbnail":{"xmlns$media":"http://search.yahoo.com/mrss/","url":"http://2.bp.blogspot.com/-LGZ-NrviG70/VZGe3ni3q9I/AAAAAAAADxs/q5prqZ1ZxII/s72-c/SqlTopology.png","height":"72","width":"72"},"thr$total":{"$t":"1"}},{"id":{"$t":"tag:blogger.com,1999:blog-3846425018755283358.post-2399833994167725913"},"published":{"$t":"2015-06-25T07:00:00.000-05:00"},"updated":{"$t":"2015-06-28T22:54:54.407-05:00"},"category":[{"scheme":"http://www.blogger.com/atom/ns#","term":"Replication"}],"title":{"type":"text","$t":"SQL Server Replication - Case Study \u0026 Implementation"},"content":{"type":"html","$t":"This post discusses evaluating a business case, then designing and implementing SQL Server replication in a step-by-step tutorial to address the requirement.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ca name\u003d'more'\u003e\u003c/a\u003e\u003cb\u003eBusiness Case\u003c/b\u003e\u003cbr /\u003e\nThere are two SQL Servers. One is hosted on Server A and one on Server B. These two servers are stand-alone servers on the same network but not joined to the same domain. Each SQL Server has a database with a similar list of tables. Within this list of tables, a set of tables (A_Tbl*) has read/write transactions performed on the database hosted on SQL Server A, with its data also available on SQL Server B for read-only purposes. Similarly, another set of tables (B_Tbl*) in the database has read/write transactions performed on SQL Server B, with the data also available on SQL Server A for read-only purposes. 
Any data change on any table of that database on one SQL Server should be promptly reflected on the other SQL Server.\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003c/div\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://3.bp.blogspot.com/-wLrGChJqqqc/VYuS-pBuoDI/AAAAAAAADuw/gbDmfa5GYnQ/s1600/UseCase.PNG\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"286\" src\u003d\"http://3.bp.blogspot.com/-wLrGChJqqqc/VYuS-pBuoDI/AAAAAAAADuw/gbDmfa5GYnQ/s320/UseCase.PNG\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\nSQL Server replication stands out as a good solution to address this business and technical requirement. A replication topology can include servers that are not in the same domain. The servers do not need to be in a cluster environment or require any domain account. SQL Server replication also provides the option of replicating selected objects (tables) to another SQL Server. The receiving database on the other SQL Server remains available for use even while new data is being replicated over.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003eDesign \u0026amp; Implementation\u003c/b\u003e\u003cbr /\u003e\nTransactional replication is chosen to allow data to be replicated in a continuous manner. The replication topology design in this case has each publisher utilize its own distributor for its publication. The distributor is hosted on the same server as the publisher. The push subscription method is chosen on each server to push the publication articles to the other server. 
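\u003cbr /\u003e\n\u003cbr /\u003e\nConfiguring a local distributor on each server can also be scripted. This is a minimal T-SQL sketch only; the distributor password below is a placeholder:\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003e-- Run on each server: the server acts as its own distributor\nEXEC sp_adddistributor @distributor \u003d @@SERVERNAME, @password \u003d N'StrongPassword1!';\nGO\n-- Create the distribution database on the local distributor\nEXEC sp_adddistributiondb @database \u003d N'distribution';\nGO\n\u003c/pre\u003e\n\u003cbr /\u003e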
A local Windows account will be created and used as the replication agent process account as well as for connecting to SQL Server.\u003cbr /\u003e\n\u003cbr /\u003e\nSQL Server transactional replication is implemented by multiple agents: the snapshot agent, the log reader agent and the distribution agent. Here is a description of each agent and its requirements.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003eSnapshot agent\u003c/b\u003e\u003cbr /\u003e\n- Prepares snapshot files (schema, scripts, indexes, data, etc.) and records synchronization status in the distribution database. Snapshot files are used to initialize subscribers for transactional replication and are also used by other replication types\u003cbr /\u003e\n- Runs at the distributor\u003cbr /\u003e\n- Connects to the publisher with either a Windows account or a SQL account. The connecting account needs at least the db_owner database role in the publication database\u003cbr /\u003e\n- Connects to the distributor with a Windows account (process account). The process account needs at least the db_owner database role in the distribution database and write permission to the snapshot share\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003eLog reader agent\u003c/b\u003e\u003cbr /\u003e\n- Monitors the transaction log on the publication database and copies transactions marked for replication to the distribution database\u003cbr /\u003e\n- Runs at the distributor\u003cbr /\u003e\n- Connects to the publisher with either a Windows account or a SQL account. A Windows account will be used in this case. The same Windows account (connecting account) needs at least the db_owner database role in the publication database\u003cbr /\u003e\n- Connects to the distributor with a Windows account (process account). 
The process account needs at least the db_owner database role in the distribution database\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003eDistribution agent \u003ci\u003e(Push subscription)\u003c/i\u003e\u003c/b\u003e\u003cbr /\u003e\n- Moves the snapshot and the transactions stored in the distribution database to the subscriber\u003cbr /\u003e\n- Runs at the distributor\u003cbr /\u003e\n- Connects to the distributor with a Windows account (process account). The process account needs at least the db_owner database role in the distribution database, read permission on the snapshot share, and membership in the PAL (Publication Access List)\u003cbr /\u003e\n- Connects to the subscriber with either a Windows account or a SQL account. A Windows account will be used in this case. The same Windows account (connecting account) needs at least the db_owner database role in the subscription database\u003cbr /\u003e\n\u003cbr /\u003e\nThis is the design layout,\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003c/div\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://2.bp.blogspot.com/-r47esowvvkU/VYuNGyuXqlI/AAAAAAAADug/f9VBLFgIazg/s1600/Design.PNG\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"322\" src\u003d\"http://2.bp.blogspot.com/-r47esowvvkU/VYuNGyuXqlI/AAAAAAAADug/f9VBLFgIazg/s400/Design.PNG\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nAs the distributor and publisher of each publication are hosted on the same server, for simplicity, one local Windows account is used for the snapshot agent, log reader agent and distribution agent on each server. 
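\u003cbr /\u003e\n\u003cbr /\u003e\nAfter a publication is created, PAL membership for the process account can be verified on the publisher. A sketch only; 'PUB_A' is a placeholder publication name:\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eUSE TEST;\nGO\n-- List the logins in the Publication Access List\nEXEC sp_help_publication_access @publication \u003d N'PUB_A';\nGO\n\u003c/pre\u003e\n\u003cbr /\u003e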
As both servers are not on the same domain, one Windows account with the same name and password will be used on each for authentication purposes.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003ePreparation\u003c/b\u003e\u003cbr /\u003e\nIn this example, we will create a database called TEST on each SQL Server. The actual server names in the example are SQL2008R2 (acting as Server A) and SQL2008R2A (acting as Server B). Each server has only the default SQL Server instance.\u003cbr /\u003e\n\u003cbr /\u003e\nThe SQL Server Agent service startup type on each server has been changed to automatic. A firewall rule to allow TCP port 1433 has been created on each server for SQL Server communication.\u003cbr /\u003e\n\u003cbr /\u003e\nLet's create the database and tables. Connect to SQL Server A and run the database and table creation script. Insert some data into the A_tbl tables\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eUSE master;\nGO\nCREATE DATABASE TEST;\nGO\nUSE TEST;\nGO\nCREATE TABLE [dbo].[A_tbl1] (\n [ID] [int] IDENTITY(1,1) NOT NULL PRIMARY KEY CLUSTERED,\n [Desc] [varchar](50) NULL ) \nGO\nCREATE TABLE [dbo].[A_tbl2] (\n [ID] [int] IDENTITY(1,1) NOT NULL PRIMARY KEY CLUSTERED,\n [Desc] [varchar](50) NULL ) \nGO\nCREATE TABLE [dbo].[A_tbl3] (\n [ID] [int] IDENTITY(1,1) NOT NULL PRIMARY KEY CLUSTERED,\n [Desc] [varchar](50) NULL ) \nGO\nCREATE TABLE [dbo].[B_tbl1] (\n [ID] [int] IDENTITY(1,1) NOT NULL PRIMARY KEY CLUSTERED,\n [Desc] [varchar](50) NULL ) \nGO\nCREATE TABLE [dbo].[B_tbl2] (\n [ID] [int] IDENTITY(1,1) NOT NULL PRIMARY KEY CLUSTERED,\n [Desc] [varchar](50) NULL ) \nGO\nCREATE TABLE [dbo].[B_tbl3] (\n [ID] [int] IDENTITY(1,1) NOT NULL PRIMARY KEY CLUSTERED,\n [Desc] [varchar](50) NULL ) \nGO\nINSERT INTO dbo.A_tbl1 VALUES ('Hello'), ('Hi');\nINSERT INTO dbo.A_tbl2 VALUES ('Morning'), ('Night');\nINSERT INTO dbo.A_tbl3 VALUES ('SQL'), ('ORACLE');\nGO\n\u003c/pre\u003e\n\u003cbr /\u003e\nOn SQL Server B, create the database and the same 
tables, then insert some data into the B_tbl tables.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eUSE master;\nGO\nCREATE DATABASE TEST;\nGO\nUSE TEST;\nGO\nCREATE TABLE [dbo].[A_tbl1] (\n [ID] [int] IDENTITY(1,1) NOT NULL PRIMARY KEY CLUSTERED,\n [Desc] [varchar](50) NULL ) \nGO\nCREATE TABLE [dbo].[A_tbl2] (\n [ID] [int] IDENTITY(1,1) NOT NULL PRIMARY KEY CLUSTERED,\n [Desc] [varchar](50) NULL ) \nGO\nCREATE TABLE [dbo].[A_tbl3] (\n [ID] [int] IDENTITY(1,1) NOT NULL PRIMARY KEY CLUSTERED,\n [Desc] [varchar](50) NULL ) \nGO\nCREATE TABLE [dbo].[B_tbl1] (\n [ID] [int] IDENTITY(1,1) NOT NULL PRIMARY KEY CLUSTERED,\n [Desc] [varchar](50) NULL ) \nGO\nCREATE TABLE [dbo].[B_tbl2] (\n [ID] [int] IDENTITY(1,1) NOT NULL PRIMARY KEY CLUSTERED,\n [Desc] [varchar](50) NULL ) \nGO\nCREATE TABLE [dbo].[B_tbl3] (\n [ID] [int] IDENTITY(1,1) NOT NULL PRIMARY KEY CLUSTERED,\n [Desc] [varchar](50) NULL ) \nGO\nINSERT INTO dbo.B_tbl1 VALUES ('Yes'), ('No');\nINSERT INTO dbo.B_tbl2 VALUES ('Food'), ('Drink');\nINSERT INTO dbo.B_tbl3 VALUES ('See ya'), ('Bye');\nGO\n\u003c/pre\u003e\n\u003cbr /\u003e\nOn Server A,\u003cbr /\u003e\nCreate a Windows account for the replication agents through Computer Management\u003cbr /\u003e\nAccount: sqlreplication\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-_TbUB8lVXk0/VYsJ22DItUI/AAAAAAAADm4/xoJgA4QsHjU/s1600/WindowsAcctCreation.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"340\" src\u003d\"http://1.bp.blogspot.com/-_TbUB8lVXk0/VYsJ22DItUI/AAAAAAAADm4/xoJgA4QsHjU/s400/WindowsAcctCreation.png\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nGrant the newly created Windows account read/write access to its snapshot replication folder (on its server). In this example, the default location is used. 
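\u003cbr /\u003e\n\u003cbr /\u003e\nAs a side note (not one of the original steps), once the distributor has been configured later in this post, the snapshot folder actually in use can be confirmed from T-SQL; the directory column returned by sp_helpdistributor shows it:\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003e-- Reports distributor properties; the 'directory' column is the snapshot folder\nEXEC sp_helpdistributor;\nGO\n\u003c/pre\u003e\n\u003cbr /\u003e\n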
The replication folder is designated during replication setup.\u003cbr /\u003e\n\u003cdiv\u003e\n\u003cbr /\u003e\ne.g. C:\\Program Files\\Microsoft SQL Server\\MSSQL10_50.MSSQLSERVER\\MSSQL\\repldata\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://3.bp.blogspot.com/-DuBzKpNWENk/VYsKargWOwI/AAAAAAAADnA/lEhW2d-tJvo/s1600/SnapshotFolderPermission.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"400\" src\u003d\"http://3.bp.blogspot.com/-DuBzKpNWENk/VYsKargWOwI/AAAAAAAADnA/lEhW2d-tJvo/s400/SnapshotFolderPermission.png\" width\u003d\"353\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nCreate a login in SQL Server A for the Windows account just created. Grant this local Windows account the db_owner role on the publication database (in this case also a subscription database for the other publication), the TEST database.\u003c/div\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://2.bp.blogspot.com/-Kff23KgyIFo/VYsLeOzX6dI/AAAAAAAADnM/Z1zxUgZvCVU/s1600/sqlreplicationLogin.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"95\" src\u003d\"http://2.bp.blogspot.com/-Kff23KgyIFo/VYsLeOzX6dI/AAAAAAAADnM/Z1zxUgZvCVU/s400/sqlreplicationLogin.png\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-cuKl6dEZxnI/VYsLfVbCBzI/AAAAAAAADnU/qnloGBCWRNk/s1600/sqlreplicationDBRole.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"400\" 
src\u003d\"http://1.bp.blogspot.com/-cuKl6dEZxnI/VYsLfVbCBzI/AAAAAAAADnU/qnloGBCWRNk/s400/sqlreplicationDBRole.png\" width\u003d\"397\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003cbr /\u003e\u003c/div\u003e\nOn Server B,\u003cbr /\u003e\nPerform the same action like Server A.\u003cbr /\u003e\n\u003cbr /\u003e\nCreates a Windows account with the \u003cb\u003esame name and same password\u003c/b\u003e like Server A for replication agents through computer management\u003cbr /\u003e\nAccount: sqlreplication\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-_TbUB8lVXk0/VYsJ22DItUI/AAAAAAAADm8/nu4TewZP_v0/s1600/WindowsAcctCreation.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"340\" src\u003d\"http://1.bp.blogspot.com/-_TbUB8lVXk0/VYsJ22DItUI/AAAAAAAADm8/nu4TewZP_v0/s400/WindowsAcctCreation.png\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nGrant the Windows account just created with read / write access to its snapshot replication folder (on its server). In this example, the default location is used. The replication folder is designated during replication setup.\u003cbr /\u003e\n\u003cdiv\u003e\n\u003cbr /\u003e\neg. 
C:\\Program Files\\Microsoft SQL Server\\MSSQL10_50.MSSQLSERVER\\MSSQL\\repldata\u003c/div\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-p26eM3y5C7Q/VYsMjjaIO8I/AAAAAAAADnc/WKNo0P4UNK8/s1600/SnapshotFolderPermissionB.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"400\" src\u003d\"http://1.bp.blogspot.com/-p26eM3y5C7Q/VYsMjjaIO8I/AAAAAAAADnc/WKNo0P4UNK8/s400/SnapshotFolderPermissionB.png\" width\u003d\"352\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nCreate a login in SQL Server B for the Windows account just created. Grant this local Windows account the db_owner role on the publication database (in this case also a subscription database for the other publication), the TEST database.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-OfensxEUTA0/VYsPfyn6-XI/AAAAAAAADno/crHaIcs8JmM/s1600/sqlreplicationLoginB.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"88\" src\u003d\"http://1.bp.blogspot.com/-OfensxEUTA0/VYsPfyn6-XI/AAAAAAAADno/crHaIcs8JmM/s400/sqlreplicationLoginB.png\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://3.bp.blogspot.com/-Bovhj8lChkU/VYsPf5h6yjI/AAAAAAAADns/9lfEZHFSHzk/s1600/sqlreplicationDBRoleB.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"396\" src\u003d\"http://3.bp.blogspot.com/-Bovhj8lChkU/VYsPf5h6yjI/AAAAAAAADns/9lfEZHFSHzk/s400/sqlreplicationDBRoleB.png\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr 
/\u003e\n\u003cb\u003eSQL Replication Implementation - Publication A\u003c/b\u003e\u003cbr /\u003e\nNow we are ready to set up the replication. Let's set up publication A (PUB_A) on SQL Server A with three articles (tables A_tbl1, A_tbl2, A_tbl3). Keep in mind that we will be setting up a push subscription to SQL Server B.\u003cbr /\u003e\n\u003cbr /\u003e\nThis example mainly uses the SSMS GUI for all the replication tasks. So, ready for the screenshots?\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://4.bp.blogspot.com/-IbOSUTz1JFs/VYsSuvAs8UI/AAAAAAAADpM/8QtWRMxhfFU/s1600/SqlReplicationA1.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"220\" src\u003d\"http://4.bp.blogspot.com/-IbOSUTz1JFs/VYsSuvAs8UI/AAAAAAAADpM/8QtWRMxhfFU/s320/SqlReplicationA1.png\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nThe distributor is hosted on the same server.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-Z7t_53x0gCI/VYsSvHJ0lbI/AAAAAAAADoE/pFK-2I1l5jg/s1600/SqlReplicationA2.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"217\" src\u003d\"http://1.bp.blogspot.com/-Z7t_53x0gCI/VYsSvHJ0lbI/AAAAAAAADoE/pFK-2I1l5jg/s400/SqlReplicationA2.png\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nThis is where we define the snapshot folder to which we granted the sqlreplication Windows account read/write permission.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca 
href\u003d\"http://3.bp.blogspot.com/-DXt6EaFfBLw/VYsSvdciwSI/AAAAAAAADoI/szgMe-fQ5Co/s1600/SqlReplicationA3.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"353\" src\u003d\"http://3.bp.blogspot.com/-DXt6EaFfBLw/VYsSvdciwSI/AAAAAAAADoI/szgMe-fQ5Co/s400/SqlReplicationA3.png\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://3.bp.blogspot.com/-Lgm6KcrMpcM/VYsSvlu-VbI/AAAAAAAADoM/mBfRBKGSpQw/s1600/SqlReplicationA4.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"142\" src\u003d\"http://3.bp.blogspot.com/-Lgm6KcrMpcM/VYsSvlu-VbI/AAAAAAAADoM/mBfRBKGSpQw/s320/SqlReplicationA4.png\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u0026nbsp;Use transactional replication\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://3.bp.blogspot.com/-x8s27TNyaj0/VYsSv4kt02I/AAAAAAAADoQ/vfXnd_0YGSM/s1600/SqlReplicationA5.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"161\" src\u003d\"http://3.bp.blogspot.com/-x8s27TNyaj0/VYsSv4kt02I/AAAAAAAADoQ/vfXnd_0YGSM/s400/SqlReplicationA5.png\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nSelecting the A_tbl1, A_tbl2, and A_tbl3 for this publication\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://2.bp.blogspot.com/-MaRK492Ms4E/VYsSwbbANPI/AAAAAAAADoU/93GDk0mhJmk/s1600/SqlReplicationA6.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" 
height\u003d\"196\" src\u003d\"http://2.bp.blogspot.com/-MaRK492Ms4E/VYsSwbbANPI/AAAAAAAADoU/93GDk0mhJmk/s400/SqlReplicationA6.png\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nWe will initiate the snapshot after we finish all the configuration\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://3.bp.blogspot.com/-n7Y4M0KvqVM/VYsUKuG1ChI/AAAAAAAADpY/bdCZNNAQ2EA/s1600/SqlReplicationA6-1.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"163\" src\u003d\"http://3.bp.blogspot.com/-n7Y4M0KvqVM/VYsUKuG1ChI/AAAAAAAADpY/bdCZNNAQ2EA/s400/SqlReplicationA6-1.png\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nUsing sqlreplication account on Server A for snapshot agent. We have previously configured the account with db_owner role in the TEST database.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://2.bp.blogspot.com/-mGURjnY0f1g/VYsSwvGP4WI/AAAAAAAADoY/XaYpXbiBOYc/s1600/SqlReplicationA7.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"340\" src\u003d\"http://2.bp.blogspot.com/-mGURjnY0f1g/VYsSwvGP4WI/AAAAAAAADoY/XaYpXbiBOYc/s400/SqlReplicationA7.png\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nBoth snapshot and log reader agent using the same local Windows account.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://2.bp.blogspot.com/-5ZvNB4M70Iw/VYsSw6RMbII/AAAAAAAADoc/DEPq6pG44PE/s1600/SqlReplicationA8.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" 
height\u003d\"167\" src\u003d\"http://2.bp.blogspot.com/-5ZvNB4M70Iw/VYsSw6RMbII/AAAAAAAADoc/DEPq6pG44PE/s400/SqlReplicationA8.png\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/--UaKXxJ-SHE/VYsSxGuVehI/AAAAAAAADog/fKiaDlkJ_fE/s1600/SqlReplicationA9.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"164\" src\u003d\"http://1.bp.blogspot.com/--UaKXxJ-SHE/VYsSxGuVehI/AAAAAAAADog/fKiaDlkJ_fE/s320/SqlReplicationA9.png\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nName this publication as PUB_A\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://4.bp.blogspot.com/-Shiy7DXBJrM/VYsSuvgTKvI/AAAAAAAADoA/BH-teQXxMw0/s1600/SqlReplicationA10.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"142\" src\u003d\"http://4.bp.blogspot.com/-Shiy7DXBJrM/VYsSuvgTKvI/AAAAAAAADoA/BH-teQXxMw0/s320/SqlReplicationA10.png\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-y95XMCdZTzY/VYsSulJSz3I/AAAAAAAADn8/GIyAdY07IDg/s1600/SqlReplicationA11.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"226\" src\u003d\"http://1.bp.blogspot.com/-y95XMCdZTzY/VYsSulJSz3I/AAAAAAAADn8/GIyAdY07IDg/s320/SqlReplicationA11.png\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nThere is one more thing to do here. 
The sqlreplication Windows account needs to be granted the db_owner role in the newly created distribution database.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-njcGbVKOYD8/VYsUtLMLGkI/AAAAAAAADpg/TfGlq5X9mAw/s1600/SqlReplicationDistRole.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"400\" src\u003d\"http://1.bp.blogspot.com/-njcGbVKOYD8/VYsUtLMLGkI/AAAAAAAADpg/TfGlq5X9mAw/s400/SqlReplicationDistRole.png\" width\u003d\"388\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nNow that the publication has been set up on SQL Server A, let's set up the subscriber (SQL Server B) for this publication. On SQL Server A, create a new subscription.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-4gKLTtryuiU/VYsWwB6pSMI/AAAAAAAADpw/pvoXLyf1WpI/s1600/SqlSubscriptionA1.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"307\" src\u003d\"http://1.bp.blogspot.com/-4gKLTtryuiU/VYsWwB6pSMI/AAAAAAAADpw/pvoXLyf1WpI/s320/SqlSubscriptionA1.png\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://3.bp.blogspot.com/-6ywzHPgVuoA/VYsWwh874wI/AAAAAAAADp8/SiWw4Z_dH_I/s1600/SqlSubscriptionA2.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"196\" src\u003d\"http://3.bp.blogspot.com/-6ywzHPgVuoA/VYsWwh874wI/AAAAAAAADp8/SiWw4Z_dH_I/s320/SqlSubscriptionA2.png\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nRun all agents at the distributor for a push 
subscription.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-LvTAWzdO0MY/VYsWwzKsU2I/AAAAAAAADqA/tYLRbGfNd5o/s1600/SqlSubscriptionA3.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"255\" src\u003d\"http://1.bp.blogspot.com/-LvTAWzdO0MY/VYsWwzKsU2I/AAAAAAAADqA/tYLRbGfNd5o/s400/SqlSubscriptionA3.png\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nAdd SQL Server B as a subscriber to this publication (PUB_A).\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://2.bp.blogspot.com/-P90jySEnlq4/VYsWxOYCW0I/AAAAAAAADqM/TZQADEYTQbU/s1600/SqlSubscriptionA4.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"311\" src\u003d\"http://2.bp.blogspot.com/-P90jySEnlq4/VYsWxOYCW0I/AAAAAAAADqM/TZQADEYTQbU/s400/SqlSubscriptionA4.png\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://3.bp.blogspot.com/-L7ReP1Jgghk/VYsWxswfEmI/AAAAAAAADqI/WFNyNyxaeiw/s1600/SqlSubscriptionA5.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"135\" src\u003d\"http://3.bp.blogspot.com/-L7ReP1Jgghk/VYsWxswfEmI/AAAAAAAADqI/WFNyNyxaeiw/s320/SqlSubscriptionA5.png\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://3.bp.blogspot.com/-fjO_fjTqBUQ/VYsWyGkZdAI/AAAAAAAADqQ/iHh8nl8AesI/s1600/SqlSubscriptionA6.png\" imageanchor\u003d\"1\" 
style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"165\" src\u003d\"http://3.bp.blogspot.com/-fjO_fjTqBUQ/VYsWyGkZdAI/AAAAAAAADqQ/iHh8nl8AesI/s400/SqlSubscriptionA6.png\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nRun distribution agent with the sqlreplication local Windows account. Connect to subscriber (SQL Server B) with the Server B sqlreplication local Windows account. This will only work when both local Windows account has the same name and password.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://3.bp.blogspot.com/-GA43sdNCIZk/VYsWyTTgk_I/AAAAAAAADqg/QwQzdiJKF2I/s1600/SqlSubscriptionA7.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"400\" src\u003d\"http://3.bp.blogspot.com/-GA43sdNCIZk/VYsWyTTgk_I/AAAAAAAADqg/QwQzdiJKF2I/s400/SqlSubscriptionA7.png\" width\u003d\"355\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://3.bp.blogspot.com/-e8OiSSxLwOk/VYsWypl9POI/AAAAAAAADqY/dtTuCoQfMl8/s1600/SqlSubscriptionA8.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"155\" src\u003d\"http://3.bp.blogspot.com/-e8OiSSxLwOk/VYsWypl9POI/AAAAAAAADqY/dtTuCoQfMl8/s400/SqlSubscriptionA8.png\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nSynchronization (or replication) run continuously\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-rFWCVcQ2Q28/VYsWy2YWbAI/AAAAAAAADqk/G-VLqF9NQJI/s1600/SqlSubscriptionA9.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; 
margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"135\" src\u003d\"http://1.bp.blogspot.com/-rFWCVcQ2Q28/VYsWy2YWbAI/AAAAAAAADqk/G-VLqF9NQJI/s400/SqlSubscriptionA9.png\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nSelect initialize when to immediately. This will start the Snapshot agent when the subscription is setup. The snapshot agent will generate the snapshot and synchronization will follow after.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003c/div\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-hVH_Y8zBIFg/VYsk3vg-MhI/AAAAAAAADs0/oQlNE6Ho7Y8/s1600/SqlSubscriptionA10-1.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"358\" src\u003d\"http://1.bp.blogspot.com/-hVH_Y8zBIFg/VYsk3vg-MhI/AAAAAAAADs0/oQlNE6Ho7Y8/s400/SqlSubscriptionA10-1.png\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-cQCXcFVSEE4/VYsWwCtW-3I/AAAAAAAADrE/292MSKZFh70/s1600/SqlSubscriptionA11.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"148\" src\u003d\"http://1.bp.blogspot.com/-cQCXcFVSEE4/VYsWwCtW-3I/AAAAAAAADrE/292MSKZFh70/s320/SqlSubscriptionA11.png\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003c/div\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-avzoOS0nlCw/VYsk3-Xf2JI/AAAAAAAADsw/lcAoYXxvb10/s1600/SqlSubscriptionA12-1.png\" 
imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"152\" src\u003d\"http://1.bp.blogspot.com/-avzoOS0nlCw/VYsk3-Xf2JI/AAAAAAAADsw/lcAoYXxvb10/s320/SqlSubscriptionA12-1.png\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nNow with both the publication and subscriber setup.\u003cbr /\u003e\n\u003cbr /\u003e\nGo to SQL Server B and verify the three A_tbl tables have updated with the data. Data show up. Great! Let's insert a record at SQL Server A.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eUSE TEST;\nINSERT INTO A_tbl2 VALUES ('Noon');\n\u003c/pre\u003e\n\u003cbr /\u003e\nCheck the synchronization status if the new record has sent to the subscriber.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-aRIAUWv5hbI/VYscMx0BBMI/AAAAAAAADrs/WJCJaVZ_DbM/s1600/SqlSyncA1.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"248\" src\u003d\"http://1.bp.blogspot.com/-aRIAUWv5hbI/VYscMx0BBMI/AAAAAAAADrs/WJCJaVZ_DbM/s320/SqlSyncA1.png\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\nSynchronization completed\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://4.bp.blogspot.com/-ayQoGGJmokE/VYscMmM6LTI/AAAAAAAADrw/Vu1sTwWEKkU/s1600/SqlSyncA2.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"182\" src\u003d\"http://4.bp.blogspot.com/-ayQoGGJmokE/VYscMmM6LTI/AAAAAAAADrw/Vu1sTwWEKkU/s400/SqlSyncA2.png\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nGo to SQL Server B. 
Check table A_tbl2; the new record 'Noon' shows up.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://3.bp.blogspot.com/-rC3LHZYHJMg/VYscM6pJ4oI/AAAAAAAADr0/AL5jg5lVUZQ/s1600/SqlSyncA3.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://3.bp.blogspot.com/-rC3LHZYHJMg/VYscM6pJ4oI/AAAAAAAADr0/AL5jg5lVUZQ/s1600/SqlSyncA3.png\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cb\u003eSQL Replication Implementation - Publication B\u003c/b\u003e\u003cbr /\u003e\nThe first publication (PUB_A) has been completed. Now we just need to set up the other publication (PUB_B) on SQL Server B. The steps are similar to those above.\u003cbr /\u003e\n\u003cbr /\u003e\nHere are some of the differences:\u003cbr /\u003e\n\u003cbr /\u003e\nSelect only the B_tbl tables for this publication.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://2.bp.blogspot.com/-JlEzuDssH6o/VYseQ7vGNkI/AAAAAAAADsI/DB0P_1Yv4rA/s1600/SqlReplicationB6.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"220\" src\u003d\"http://2.bp.blogspot.com/-JlEzuDssH6o/VYseQ7vGNkI/AAAAAAAADsI/DB0P_1Yv4rA/s400/SqlReplicationB6.png\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nThe local Windows account used is the \u003cb\u003esqlreplication Windows account on Server B (NOT Server A),\u003c/b\u003e although the name is the same. 
See the sql2008r2\u003cb\u003ea\u003c/b\u003e (with the a)?\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://3.bp.blogspot.com/-4wk7PL_Eecw/VYseQzdBV_I/AAAAAAAADsM/58oG3cRx7FQ/s1600/SqlReplicationB7.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"343\" src\u003d\"http://3.bp.blogspot.com/-4wk7PL_Eecw/VYseQzdBV_I/AAAAAAAADsM/58oG3cRx7FQ/s400/SqlReplicationB7.png\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://2.bp.blogspot.com/-BTvW8eMZ6IA/VYseRT6j-QI/AAAAAAAADsQ/MZ2hvMrrD3E/s1600/SqlReplicationB8.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"183\" src\u003d\"http://2.bp.blogspot.com/-BTvW8eMZ6IA/VYseRT6j-QI/AAAAAAAADsQ/MZ2hvMrrD3E/s400/SqlReplicationB8.png\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nThis publication is named PUB_B.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://2.bp.blogspot.com/-jxmy-jLiNzY/VYseQ3O-65I/AAAAAAAADsY/sDAI-D7UHmA/s1600/SqlReplicationB10.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"157\" src\u003d\"http://2.bp.blogspot.com/-jxmy-jLiNzY/VYseQ3O-65I/AAAAAAAADsY/sDAI-D7UHmA/s320/SqlReplicationB10.png\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nFor the subscriber setup, here are some of the differences.\u003cbr /\u003e\n\u003cbr /\u003e\nConnect to SQL Server A to add it as the subscriber.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; 
text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-iGunn2KFerI/VYsmkNgvdzI/AAAAAAAADtE/aTWp88PBllA/s1600/SqlSubscriptionB5.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"137\" src\u003d\"http://1.bp.blogspot.com/-iGunn2KFerI/VYsmkNgvdzI/AAAAAAAADtE/aTWp88PBllA/s320/SqlSubscriptionB5.png\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://4.bp.blogspot.com/-zR4jZo4Alpc/VYsmkN6NUPI/AAAAAAAADtc/xhLHQifcFfI/s1600/SqlSubscriptionB6.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"178\" src\u003d\"http://4.bp.blogspot.com/-zR4jZo4Alpc/VYsmkN6NUPI/AAAAAAAADtc/xhLHQifcFfI/s400/SqlSubscriptionB6.png\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nSimilarly, setup the Server B sqlreplication local Windows account (NOT the server A) as the process account.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://2.bp.blogspot.com/-ZEK7aT_efmQ/VYsmkIy_ZjI/AAAAAAAADtI/aAiJJ1kF5yE/s1600/SqlSubscriptionB7.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"400\" src\u003d\"http://2.bp.blogspot.com/-ZEK7aT_efmQ/VYsmkIy_ZjI/AAAAAAAADtI/aAiJJ1kF5yE/s400/SqlSubscriptionB7.png\" width\u003d\"325\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://3.bp.blogspot.com/-_jp27JYqSFc/VYsmkv9bIEI/AAAAAAAADtM/Drh-oUJba9U/s1600/SqlSubscriptionB8.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg 
border\u003d\"0\" height\u003d\"142\" src\u003d\"http://3.bp.blogspot.com/-_jp27JYqSFc/VYsmkv9bIEI/AAAAAAAADtM/Drh-oUJba9U/s400/SqlSubscriptionB8.png\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nInsert a record to B_tbl1 on SQL Server B,\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eUSE TEST;\nINSERT INTO B_tbl1 VALUES ('MayBe');\n\u003c/pre\u003e\n\u003cbr /\u003e\nVerify the synchronization status,\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://2.bp.blogspot.com/-4kw9WjX91jo/VYsnwzl41kI/AAAAAAAADtk/zDbemqg8O78/s1600/SqlSyncB2.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"181\" src\u003d\"http://2.bp.blogspot.com/-4kw9WjX91jo/VYsnwzl41kI/AAAAAAAADtk/zDbemqg8O78/s400/SqlSyncB2.png\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nGo to SQL Server A, check the B_tbl1 tables.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-PvaUGJQMg1s/VYsn7xwVx9I/AAAAAAAADts/FLJTC_bMroA/s1600/SqlSyncB3.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://1.bp.blogspot.com/-PvaUGJQMg1s/VYsn7xwVx9I/AAAAAAAADts/FLJTC_bMroA/s1600/SqlSyncB3.png\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nIt has been verified that two publications successfully replicated to each other. 
Mission accomplished."},"link":[{"rel":"replies","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/2399833994167725913/comments/default","title":"Post Comments"},{"rel":"replies","type":"text/html","href":"https://www.travisgan.com/2015/06/sql-server-replication-case-study.html#comment-form","title":"0 Comments"},{"rel":"edit","type":"application/atom+xml","href":"https://draft.blogger.com/feeds/3846425018755283358/posts/default/2399833994167725913"},{"rel":"self","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/posts/default/2399833994167725913"},{"rel":"alternate","type":"text/html","href":"https://www.travisgan.com/2015/06/sql-server-replication-case-study.html","title":"SQL Server Replication - Case Study \u0026 Implementation"}],"author":[{"name":{"$t":"Travis"},"email":{"$t":"noreply@blogger.com"},"gd$image":{"rel":"http://schemas.google.com/g/2005#thumbnail","width":"35","height":"35","src":"//www.blogger.com/img/blogger_logo_round_35.png"}}],"media$thumbnail":{"xmlns$media":"http://search.yahoo.com/mrss/","url":"http://3.bp.blogspot.com/-wLrGChJqqqc/VYuS-pBuoDI/AAAAAAAADuw/gbDmfa5GYnQ/s72-c/UseCase.PNG","height":"72","width":"72"},"thr$total":{"$t":"0"}},{"id":{"$t":"tag:blogger.com,1999:blog-3846425018755283358.post-849990751283414222"},"published":{"$t":"2015-06-19T07:00:00.003-05:00"},"updated":{"$t":"2022-08-15T23:45:18.679-05:00"},"category":[{"scheme":"http://www.blogger.com/atom/ns#","term":"PowerShell"},{"scheme":"http://www.blogger.com/atom/ns#","term":"Security"}],"title":{"type":"text","$t":"Powershell Password Encryption \u0026 Decryption"},"content":{"type":"html","$t":"\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://2.bp.blogspot.com/-M_B1FMcb3Sw/VYOGpeMcaQI/AAAAAAAADmg/FW-laIt1rUU/s1600/Powershell_Encryption.jpg\" style\u003d\"clear: left; float: left; margin-bottom: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" 
height\u003d\"133\" src\u003d\"https://2.bp.blogspot.com/-M_B1FMcb3Sw/VYOGpeMcaQI/AAAAAAAADmg/FW-laIt1rUU/s200/Powershell_Encryption.jpg\" width\u003d\"200\" /\u003e\u003c/a\u003e\u003c/div\u003e\nOne of the common tasks in PowerShell script design and execution is handling a credential encryption requirement. A privileged account is often used, and its credentials need to be passed to the script in order to access resources. This becomes crucial especially when the execution tasks are delegated to other users or automated. As storing the password as clear text is a huge security risk and the last thing desired, in this blog post we discuss a few options for storing the credential securely.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ca name\u003d'more'\u003e\u003c/a\u003eIf storing the account credential in the script can be avoided, attempt that first. If the predefined / privileged account used is a Windows account (eg. a local or domain account), and its process is executed through some task like Windows Task Scheduler (or a PowerShell scheduled job), the privileged account can be set up as the running account of the task, letting Windows handle the credential encryption securely during the setup. 
When the delegated user initiates the task on demand, or the task is executed automatically on a schedule, the predefined account, as the running account, is used to access all the required resources with Windows integrated authentication.\u003cbr /\u003e\n\n\u003ca href\u003d\"https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEii1BMeoAi_RkDEg0nluC-eMSxdCCZZz065oOx406USA23mwt6_kDb3JUWC4oqGCNYJ7qKEi50HcsGon8nGklRwPE1f1au2O50qgfmopPVtKPMQPKnj7wFLkeu430tUcQbAIvJhWxZF_ezqFYm9K-tmwDblXfIwkuxC5c5HpMixpFjpqFaTIPjVleFjiA/s598/taskrunningaccount.png\" style\u003d\"display: block; padding: 1em 0; \"\u003e\u003cimg alt\u003d\"\" border\u003d\"0\" width\u003d\"400\" data-original-height\u003d\"174\" data-original-width\u003d\"598\" src\u003d\"https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEii1BMeoAi_RkDEg0nluC-eMSxdCCZZz065oOx406USA23mwt6_kDb3JUWC4oqGCNYJ7qKEi50HcsGon8nGklRwPE1f1au2O50qgfmopPVtKPMQPKnj7wFLkeu430tUcQbAIvJhWxZF_ezqFYm9K-tmwDblXfIwkuxC5c5HpMixpFjpqFaTIPjVleFjiA/s400/taskrunningaccount.png\"/\u003e\u003c/a\u003e\n\nOr assign it with PowerShell,\n\n\u003cpre class\u003d\"brush:ps\"\u003e\n$principal \u003d New-ScheduledTaskPrincipal -UserId DomainA\\TestUser\nSet-ScheduledTask -TaskName TestingTask -Principal $principal\n\u003c/pre\u003e\n\nOften, however, different Windows accounts or third-party software accounts are used to access different types of resources within the execution. In these cases, it is likely that we will need to manage the account credential encryption ourselves.\u003cbr /\u003e\n\u003cbr /\u003e\nPowerShell provides some native commands for encryption. The commonly used commands are ConvertTo-SecureString and ConvertFrom-SecureString.\u003cbr /\u003e\n\u003cbr /\u003e\nAs an example, this script obtains some operating system information from a remote server. 
However, the user who runs the PowerShell script does not have access to the remote server.\n\n\u003cpre class\u003d\"brush:ps\"\u003eGet-WmiObject -Class win32_OperatingSystem -ComputerName RemoteServerA\n\u003c/pre\u003e\n\nAs the executing account (the user's Windows account) doesn't have enough privilege, an access-denied error is encountered,\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cspan style\u003d\"color: red;\"\u003eGet-WmiObject : Access is denied. (Exception from HRESULT: 0x80070005 (E_ACCESSDENIED))\u003c/span\u003e\u003cbr /\u003e\n\u003cspan style\u003d\"color: red;\"\u003eAt line:1 char:2\u003c/span\u003e\u003cbr /\u003e\n\u003cspan style\u003d\"color: red;\"\u003e+ \u0026nbsp;Get-WmiObject -Class win32_OperatingSystem -ComputerName RemoteServerA\u003c/span\u003e\u003cbr /\u003e\n\u003cspan style\u003d\"color: red;\"\u003e+ \u0026nbsp;~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\u003c/span\u003e\u003cbr /\u003e\n\u003cspan style\u003d\"color: red;\"\u003e\u0026nbsp; \u0026nbsp; + CategoryInfo \u0026nbsp; \u0026nbsp; \u0026nbsp; \u0026nbsp; \u0026nbsp;: NotSpecified: (:) [Get-WmiObject], UnauthorizedAccessException\u003c/span\u003e\u003cbr /\u003e\n\u003cspan style\u003d\"color: red;\"\u003e\u0026nbsp; \u0026nbsp; + FullyQualifiedErrorId : System.UnauthorizedAccessException,Microsoft.PowerShell.Commands.GetWmiObjectCommand\u003c/span\u003e\u003cbr /\u003e\n\u003cbr /\u003e\nTo incorporate a predefined Windows account that has been granted permission on the server to obtain this information,\n\n\u003cpre class\u003d\"brush:ps\"\u003e$User \u003d 'TestUser'\n$Password \u003d 'B@dPassw0rd!'\n$SecurePassword \u003d $Password | ConvertTo-SecureString -AsPlainText -Force\n$UserCred \u003d New-Object System.Management.Automation.PSCredential ($User, $SecurePassword)\nGet-WmiObject -Class win32_OperatingSystem -ComputerName RemoteServerA -Credential $UserCred\n\u003c/pre\u003e\n\nWith the password in plain text, we would need to 
encrypt the password here.\u003cbr /\u003e\n\u003cbr /\u003e\nThe ConvertTo-SecureString cmdlet converts the password into a System.Security.SecureString object. This object represents text that should be kept confidential, for example by being deleted from computer memory when no longer needed. \n\u003cbr /\u003e\u003cbr /\u003e\nTo pre-encrypt the password, we could use the same command and hard-code the encrypted password in the script, or save it in another file, in the registry, or some other place.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003eStore encrypted password in a file (txt file)\u003c/b\u003e\u003cbr /\u003e\nTo save the encrypted password text into a file,\n\n\u003cpre class\u003d\"brush:ps\"\u003e$User \u003d 'TestUser'\n$Password \u003d 'B@dPassw0rd!'\n$Password | ConvertTo-SecureString -AsPlainText -Force | ConvertFrom-SecureString | Out-File C:\\encrypted.txt\n\u003c/pre\u003e\n\nNote: The -Force flag is optional starting in PowerShell 7.\n\u003cbr /\u003e\u003cbr /\u003e\nThe ConvertFrom-SecureString cmdlet converts a secure string (System.Security.SecureString) into an encrypted standard string, using the Windows Data Protection API (DPAPI) scoped to the user account by default (we will discuss some issues with this default setting later in this post), OR with an encryption key if one is provided. 
The encrypted password text (eg. '01000000d08c9ddf0115d1118c7a00c04fc29...') is then saved into a file.\u003cbr /\u003e\n\u003cbr /\u003e\nA better approach is to not display the password in clear text at all,\n\n\u003cpre class\u003d\"brush:ps\"\u003e$UserCred \u003d Get-Credential\n$UserCred.Password | ConvertFrom-SecureString | Out-File C:\\encrypted.txt\n\u003c/pre\u003e\n\nGet-Credential prompts for the user name and password, and saves them directly into a secure PSCredential object.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003eDecrypt encrypted password in a file (txt file)\u003c/b\u003e\u003cbr /\u003e\nWith the encrypted password stored in the file, the script now simply has to extract the encrypted password and pass it to the PSCredential object,\n\n\u003cpre class\u003d\"brush:ps\"\u003e\n$User \u003d 'TestUser'\n$SecurePassword \u003d Get-Content C:\\encrypted.txt | ConvertTo-SecureString\n$UserCred \u003d New-Object System.Management.Automation.PSCredential ($User, $SecurePassword)\nGet-WmiObject -Class win32_OperatingSystem -ComputerName RemoteServerA -Credential $UserCred\n\u003c/pre\u003e\n\n\u003cb\u003eStore encrypted password in a file using Export-Clixml (xml file)\u003c/b\u003e\u003cbr /\u003e\nAnother option is to save it as an XML file. 
We could save the PSCredential object, with both the user name and encrypted password, to an XML file.\n\n\u003cpre class\u003d\"brush:ps\"\u003eGet-Credential | Export-Clixml -Path C:\\encrypted.xml\n\u003c/pre\u003e\n\n\u003cb\u003eDecrypt encrypted password in a file using Import-Clixml (xml file)\u003c/b\u003e\u003cbr /\u003e\nTo load the XML directly back into a PSCredential object,\u003cbr /\u003e\n\n\u003cpre class\u003d\"brush:ps\"\u003e\n$UserCred \u003d Import-Clixml -Path C:\\encrypted.xml\nGet-WmiObject -Class win32_OperatingSystem -ComputerName RemoteServerA -Credential $UserCred\n\u003c/pre\u003e\n\nNote that ConvertTo-SecureString converts the encrypted password text into a SecureString object. A lot of (but not all) Windows and 3rd party PowerShell cmdlets utilize the SecureString object for authentication purposes, such as the PSCredential object or Compellent Get-SCConnection. In general, the password should not be decrypted back to plain text as a String object, as it would then be exposed in memory until it is removed by the garbage collector.\u003cbr /\u003e\n\u003cbr /\u003e\nHowever, some cmdlets do not utilize the SecureString and require the password in plain text. The SecureString object in this case will need to be decrypted into plain text. 
If the account information has been constructed into a PSCredential object, the password can be extracted in plain text,\n\n\u003cpre class\u003d\"brush:ps\"\u003e$Password \u003d $UserCred.GetNetworkCredential().Password\n\u003c/pre\u003e\n\nOR, starting in PowerShell 7,\n\n\u003cpre class\u003d\"brush:ps\"\u003e\n$Password \u003d $UserCred.Password | ConvertFrom-SecureString -AsPlainText\n\u003c/pre\u003e\n\n\u003cb\u003eStore encrypted password in a file for 3rd party password (txt file)\u003c/b\u003e\u003cbr /\u003e\nIn the case of encrypting a third-party account password,\u003cbr /\u003e\n\n\u003cpre class\u003d\"brush:ps\"\u003e\n$SecurePassword \u003d Read-Host -AsSecureString\n$SecurePassword | ConvertFrom-SecureString | Out-File C:\\encrypted.txt\n\u003c/pre\u003e\n\n\u003cb\u003eDecrypt encrypted password in a file for 3rd party password (txt file)\u003c/b\u003e\u003cbr /\u003e\nDecrypt the 3rd party password into plain text by converting the SecureString into a BSTR (binary string) object,\u003cbr /\u003e\n\n\u003cpre class\u003d\"brush:ps\"\u003e\n$User \u003d 'TestUser'\n$SecurePassword \u003d Get-Content C:\\encrypted.txt | ConvertTo-SecureString\n$Marshal \u003d [System.Runtime.InteropServices.Marshal]\n$Bstr \u003d $Marshal::SecureStringToBSTR($SecurePassword)\n$Password \u003d $Marshal::PtrToStringAuto($Bstr)\n$Marshal::ZeroFreeBSTR($Bstr)\n\u003c/pre\u003e\n\nThe last line, ZeroFreeBSTR, zeroes and frees the unmanaged memory.\n\nOR, starting in PowerShell 7,\n\n\u003cpre class\u003d\"brush:ps\"\u003e$User \u003d 'TestUser'\n$SecurePassword \u003d Get-Content C:\\encrypted.txt | ConvertTo-SecureString\n$Password \u003d $SecurePassword | ConvertFrom-SecureString -AsPlainText\n\u003c/pre\u003e\n\n\u003cb\u003eStore encrypted password in registry\u003c/b\u003e\u003cbr /\u003e\nAnother method is to save the encrypted password in the registry. The example below uses the HKCU (HKEY_CURRENT_USER) hive, where the registry entry is only applicable to the current user. 
You may want to use another, more appropriate hive (eg. HKLM) if other users need access.\n\n\u003cpre class\u003d\"brush:ps\"\u003e\n$SecurePasswordText \u003d 'B@dPassw0rd!' | ConvertTo-SecureString -AsPlainText -Force | ConvertFrom-SecureString\nNew-Item -Path HKCU:\\Software\\Test -Value $SecurePasswordText\n\u003c/pre\u003e\n\nOR\n\n\u003cpre class\u003d\"brush:ps\"\u003e$UserCred \u003d Get-Credential\nNew-Item -Path HKCU:\\Software\\Test -Value ($UserCred.Password | ConvertFrom-SecureString)\n\u003c/pre\u003e\n\nOR, storing both the user name and password in the registry,\u003cbr /\u003e\n\n\u003cpre class\u003d\"brush:ps\"\u003e\nNew-Item -Path HKCU:\\Software\\Test\nNew-ItemProperty -Path HKCU:\\Software\\Test -Name User -Value ($UserCred.UserName)\nNew-ItemProperty -Path HKCU:\\Software\\Test -Name Password -Value ($UserCred.Password | ConvertFrom-SecureString)\n\u003c/pre\u003e\n\n\u003cb\u003eTo decrypt encrypted password stored in registry\u003c/b\u003e\u003cbr /\u003e\nWith the encrypted password stored in the registry, here is how to extract it for the PSCredential object in the script,\u003cbr /\u003e\n\n\u003cpre class\u003d\"brush:ps\"\u003e\n$User \u003d 'TestUser'\n$SecurePassword \u003d (Get-ItemProperty -Path HKCU:\\Software\\Test).'(Default)' | ConvertTo-SecureString\n$UserCred \u003d New-Object System.Management.Automation.PSCredential ($User, $SecurePassword)\nGet-WmiObject -Class win32_OperatingSystem -ComputerName RemoteServerA -Credential $UserCred\n\u003c/pre\u003e\n\nOR\n\n\u003cpre class\u003d\"brush:ps\"\u003e$User \u003d (Get-ItemProperty -Path HKCU:\\Software\\Test).User\n$SecurePassword \u003d (Get-ItemProperty -Path HKCU:\\Software\\Test).Password | ConvertTo-SecureString\n$UserCred \u003d New-Object System.Management.Automation.PSCredential ($User, $SecurePassword)\nGet-WmiObject -Class win32_OperatingSystem -ComputerName RemoteServerA -Credential $UserCred\n\u003c/pre\u003e\n\n\u003cb\u003eIssue with default encryption using user 
account\u003c/b\u003e\u003cbr /\u003e\nBy default, the ConvertTo-SecureString cmdlet uses the current user's credentials to generate an encryption key, which is stored within the user profile (eg. %Userprofile%\\Application Data\\Microsoft\\Crypto\\RSA\\User SID for the RSA key). The encryption key is then used to encrypt the intended string. The same user's profile is created independently on different computers. Unless that particular user account has been set up with a roaming profile, the encryption key in the user profile on one computer does not synchronize with the user profile on another computer. This creates some issues. Suppose a person pre-encrypts the password on his computer, then deploys that to another server. First, the encrypted password text can't be decrypted there, because the encryption key is not present in the person's user profile on the server. Needless to say, other users won't be able to decrypt the password either, because they don't have the encryption key. This is the error received.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cspan style\u003d\"color: red;\"\u003eConvertTo-SecureString : Key not valid for use in specified state.\u003c/span\u003e\u003cbr /\u003e\n\u003cspan style\u003d\"color: red;\"\u003eAt line:1 char:336\u003c/span\u003e\u003cbr /\u003e\n\u003cspan style\u003d\"color: red;\"\u003e+ ... 
bf6ff4d7ae3\" | ConvertTo-SecureString\u003c/span\u003e\u003cbr /\u003e\n\u003cspan style\u003d\"color: red;\"\u003e+ \u0026nbsp; \u0026nbsp; \u0026nbsp; \u0026nbsp; \u0026nbsp; \u0026nbsp; \u0026nbsp; \u0026nbsp; \u0026nbsp; \u0026nbsp;~~~~~~~~~~~~~~~~~~~~~~\u003c/span\u003e\u003cbr /\u003e\n\u003cspan style\u003d\"color: red;\"\u003e\u0026nbsp; \u0026nbsp; + CategoryInfo \u0026nbsp; \u0026nbsp; \u0026nbsp; \u0026nbsp; \u0026nbsp;: InvalidArgument: (:) [ConvertTo-SecureString], CryptographicException\u003c/span\u003e\u003cbr /\u003e\n\u003cspan style\u003d\"color: red;\"\u003e\u0026nbsp; \u0026nbsp; + FullyQualifiedErrorId : ImportSecureString_InvalidArgument_CryptographicError,Microsoft.PowerShell.Commands.Conv\u003c/span\u003e\u003cbr /\u003e\n\u003cspan style\u003d\"color: red;\"\u003e\u0026nbsp; \u0026nbsp;ertToSecureStringCommand\u003c/span\u003e\u003cbr /\u003e\n\u003cbr /\u003e\nIf it is only to resolve the issue locally on one server, a task could be set up in a way that always uses the same running account to execute the script. This way, we could encrypt the predefined account password using the running account. As the running account has the encryption key stored in its user profile, it is able to decrypt the password during task execution regardless of who initiated the task. However, that also means one person will be the developer of the script, the admin of the server, and have access to the running account credential.\u003cbr /\u003e\n\u003cbr /\u003e\nTo perform encryption with a running account, we could log in to the server as the running account, or use the RunAs command in our own remote session to open a PowerShell session as the running account to perform the encryption,\n\n\u003cpre class\u003d\"brush:ps\"\u003erunas /profile /user:RunningAcctA powershell.exe\n\u003c/pre\u003e\n\nNote that the running account's profile needs to be loaded in order to store the encryption key, as discussed earlier. /profile is the default parameter. 
The /profile parameter is listed here as an explicit example.\u003cbr /\u003e\n\u003cbr /\u003e\nIn the PowerShell session, use whoami to identify the account of that session. Once verified, generate the encrypted password file from that session.\u003cbr /\u003e\n\u003cbr /\u003e\nThis trick applies to using the LocalSystem account as well. The easiest way to run a PowerShell session as LocalSystem is using PsExec by Mark Russinovich from Microsoft. You can find the download at this Sysinternals \u003ca href\u003d\"https://technet.microsoft.com/en-us/sysinternals/bb897553.aspx\" target\u003d\"_blank\"\u003elink\u003c/a\u003e, and I have some PsExec examples in this \u003ca href\u003d\"http://www.travisgan.com/2013/06/sql-server-locked-out-sysadmin-access.html\" target\u003d\"_blank\"\u003epost\u003c/a\u003e as well, toward the middle section.\n\n\u003cpre class\u003d\"brush:ps\"\u003epsexec -s -i powershell.exe\n\u003c/pre\u003e\n\nKeep in mind that if the script is migrated to another server, the encryption step needs to be performed again, as the previously generated encryption key is only stored in the running account's profile on that particular server.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003eCustom Encryption Key\u003c/b\u003e\u003cbr /\u003e\nOne other way to address the multiple-server and different-user issue is to use a specific encryption key. The ConvertTo-SecureString cmdlet allows a key to be provided for the encryption. The valid encryption key lengths are 16, 24, and 32 bytes (128, 192, and 256 bits). 
Using an encryption key allows the encrypted password to be decrypted on a different server with a different account.\u003cbr /\u003e\n\u003cbr /\u003e\nTo generate a valid key, we could use the RNGCryptoServiceProvider class to generate random bytes for the key.\u003cbr /\u003e\n\n\u003cpre class\u003d\"brush:ps\"\u003e\n$EncryptKey \u003d New-Object Byte[] 16  # An example of a 16-byte key\n[Security.Cryptography.RNGCryptoServiceProvider]::Create().GetBytes($EncryptKey)\n\u003c/pre\u003e\n\nIn this example, a simple 1-to-16 array is used as the key.\n\n\u003cpre class\u003d\"brush:ps\"\u003e\n[byte[]] $EncryptKey \u003d (1..16)   # An example of a simple key (1, 2, 3, ..., 14, 15, 16)\n$UserCred \u003d Get-Credential\n$UserCred.Password | ConvertFrom-SecureString -Key $EncryptKey | Out-File C:\\encrypted.txt\n\u003c/pre\u003e\n\nIn the script,\n\n\u003cpre class\u003d\"brush:ps\"\u003e\n[byte[]] $EncryptKey \u003d (1..16)\n$User \u003d 'TestUser'\n$SecurePassword \u003d Get-Content C:\\encrypted.txt | ConvertTo-SecureString -Key $EncryptKey\n$UserCred \u003d New-Object System.Management.Automation.PSCredential ($User, $SecurePassword)\nGet-WmiObject -Class win32_OperatingSystem -ComputerName RemoteServerA -Credential $UserCred\n\u003c/pre\u003e\n\nHowever, hard-coding the encryption key in the script exposes the key and allows an unintended person to potentially decrypt the encrypted password. There are a few options to manage the key.\u003cbr /\u003e\n\u003cbr /\u003e\nStore the key in a file with access privileges granted only to the intended user or executing/service account; this should be done with the encrypted password file as well. 
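Restricting the files to one account can be sketched with the built-in ACL cmdlets. This is only a sketch under assumptions: DOMAIN\SvcAccount is a hypothetical service account name, and C:\Key.txt / C:\encrypted.txt are the file paths used in the examples in this post.

```powershell
# Sketch: lock down the key and encrypted password files so that only a
# hypothetical service account (DOMAIN\SvcAccount) retains read access.
foreach ($file in 'C:\Key.txt', 'C:\encrypted.txt') {
    $acl = Get-Acl -Path $file
    # Stop inheriting permissions from the parent folder and drop the inherited rules
    $acl.SetAccessRuleProtection($true, $false)
    # Grant read access to the service account only
    $rule = New-Object System.Security.AccessControl.FileSystemAccessRule('DOMAIN\SvcAccount', 'Read', 'Allow')
    $acl.AddAccessRule($rule)
    Set-Acl -Path $file -AclObject $acl
}
```

With the files locked down this way, only that account can read the key and the encrypted password back in the script.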
For example,\u003cbr /\u003e\n\n\u003cpre class\u003d\"brush:ps\"\u003e\n$User \u003d 'TestUser'\n$SecureKey \u003d Get-Content C:\\Key.txt | ConvertTo-SecureString\n$SecurePassword \u003d Get-Content C:\\encrypted.txt | ConvertTo-SecureString -SecureKey $SecureKey\n$UserCred \u003d New-Object System.Management.Automation.PSCredential ($User, $SecurePassword)\nGet-WmiObject -Class win32_OperatingSystem -ComputerName RemoteServerA -Credential $UserCred\n\u003c/pre\u003e\n\nJust a reminder that if the process is going to be executed by a delegated user, that user account will need read access to the encrypted password and key files. This method prevents an unintended person from obtaining the key. However, if the intent is to prevent the delegated user from obtaining the password, this method would not be sufficient. One way to address this is to set up a task / service that executes the script with a service account (executing account), where only that account has read access (through the Access Control List, ACL) to the encrypted password and key files. When the delegated user initiates the task / service, the process utilizes the service account (executing account) to access the files. This limits the exposure of the encrypted password and key files to only the service account.\u003cbr /\u003e\n\u003cbr /\u003e\nThere are other options, like using a certificate to encrypt the key file. Dave Wyatt has a good post on this \u003ca href\u003d\"http://powershell.org/wp/2014/02/01/revisited-powershell-and-encryption/\" target\u003d\"_blank\"\u003ehere\u003c/a\u003e. The way it works is that the user needs the certificate's private key in order to decrypt the encryption key. This option faces a similar concern as the key file.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003eUsing DPAPI \u003c/b\u003e\u003cb\u003eProtectedData Class for encryption\u003c/b\u003e\u003cbr /\u003e\nThe DPAPI ProtectedData class provides another method to encrypt and decrypt data. 
As discussed earlier, ConvertTo-SecureString uses the user account scope by default; the ProtectedData class provides encryption scope options of CurrentUser or LocalMachine. For scenarios like having delegated users run the script on the server, the predefined account password could be encrypted with the LocalMachine option (scope), and any user could then decrypt the password on that machine.\n\n\u003cpre class\u003d\"brush:ps\"\u003e\n$Password \u003d \"B@dPassw0rd!\"\n$PasswordBytes \u003d [System.Text.Encoding]::Unicode.GetBytes($Password)\n$SecurePassword \u003d [Security.Cryptography.ProtectedData]::Protect($PasswordBytes, $null, [Security.Cryptography.DataProtectionScope]::LocalMachine)\n$SecurePasswordStr \u003d [System.Convert]::ToBase64String($SecurePassword)\n\u003c/pre\u003e\nTo decrypt the encrypted password,\u003cbr /\u003e\n\u003cpre class\u003d\"brush:ps\"\u003e\n#$SecurePasswordStr\n$SecureStr \u003d [System.Convert]::FromBase64String($SecurePasswordStr)\n$StringBytes \u003d [Security.Cryptography.ProtectedData]::Unprotect($SecureStr, $null, [Security.Cryptography.DataProtectionScope]::LocalMachine)\n$PasswordStr \u003d [System.Text.Encoding]::Unicode.GetString($StringBytes)\n\u003c/pre\u003e\n\nNote that the ProtectedData class returns a byte array object (which is later converted to a string), as opposed to a SecureString object. If another cmdlet needs a SecureString as its parameter, the password will need to be converted into a SecureString object.\n\u003cbr /\u003e\u003cbr /\u003e\n\u003cb\u003eUse SecretManagement and SecretStore (or other 3rd party extension vault) PowerShell module\u003c/b\u003e\u003cbr /\u003e\nMicrosoft released these two new modules in 2021. The SecretManagement module helps users manage secrets that are stored across vaults (local or remote). The SecretStore module is a cross-platform local extension vault supported in all environments that PowerShell 7 supports. 
The SecretStore stores the secrets locally for the current user and uses .NET Core cryptographic APIs to encrypt the file content. Even without the vault password option, it still encrypts the secrets, but the key is stored in the current user location.\n\u003cbr/\u003e\u003cbr/\u003e\nThe modules need to be installed and configured first.\n\n\u003cpre class\u003d\"brush:ps\"\u003e\nInstall-Module Microsoft.PowerShell.SecretManagement, Microsoft.PowerShell.SecretStore -Scope CurrentUser\nRegister-SecretVault -Name SecretStore -ModuleName Microsoft.PowerShell.SecretStore -DefaultVault\n\n# Setting the SecretStore password. It prompts to enter the vault password since it is not provided\nSet-SecretStorePassword\n\u003c/pre\u003e\n\nSetting up a secret,\n\n\u003cpre class\u003d\"brush:ps\"\u003e\n# Setting a secret. If not provided, it will prompt to enter it in secure format\nSet-Secret -Name TestUserPassword -Secret 'B@dPassw0rd!'\n\u003c/pre\u003e\n\nTo retrieve the password, first unlock the vault. The vault stays unlocked for the duration of the PasswordTimeout setting (default 15 minutes).\n\n\u003cpre class\u003d\"brush:ps\"\u003e\n# Unlock the vault with the vault password, provided as a SecureString.\n# If not provided, it prompts to enter the vault password\nUnlock-SecretStore\n\nGet-Secret -Name TestUserPassword -AsPlainText\n\u003c/pre\u003e\nThere is much more to these SecretManagement and SecretStore (and also 3rd party vault) modules. 
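Tying this back to the earlier remote-server example, a script could pull the secret out of the vault and build the PSCredential from it. A sketch, reusing the TestUser account, the TestUserPassword secret, and the RemoteServerA server names from the examples above:

```powershell
# Unlock the vault first (prompts for the vault password if not supplied)
Unlock-SecretStore

# For a string secret, Get-Secret returns a SecureString by default,
# so it can feed the PSCredential constructor directly
$SecurePassword = Get-Secret -Name TestUserPassword
$UserCred = New-Object System.Management.Automation.PSCredential ('TestUser', $SecurePassword)
Get-WmiObject -Class win32_OperatingSystem -ComputerName RemoteServerA -Credential $UserCred
```

This keeps the password out of the script, out of loose files, and out of plain text for the entire flow.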
More details in another post.\n"},"link":[{"rel":"replies","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/849990751283414222/comments/default","title":"Post Comments"},{"rel":"replies","type":"text/html","href":"https://www.travisgan.com/2015/06/powershell-password-encryption.html#comment-form","title":"0 Comments"},{"rel":"edit","type":"application/atom+xml","href":"https://draft.blogger.com/feeds/3846425018755283358/posts/default/849990751283414222"},{"rel":"self","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/posts/default/849990751283414222"},{"rel":"alternate","type":"text/html","href":"https://www.travisgan.com/2015/06/powershell-password-encryption.html","title":"Powershell Password Encryption \u0026 Decryption"}],"author":[{"name":{"$t":"Travis"},"email":{"$t":"noreply@blogger.com"},"gd$image":{"rel":"http://schemas.google.com/g/2005#thumbnail","width":"35","height":"35","src":"//www.blogger.com/img/blogger_logo_round_35.png"}}],"media$thumbnail":{"xmlns$media":"http://search.yahoo.com/mrss/","url":"https://2.bp.blogspot.com/-M_B1FMcb3Sw/VYOGpeMcaQI/AAAAAAAADmg/FW-laIt1rUU/s72-c/Powershell_Encryption.jpg","height":"72","width":"72"},"thr$total":{"$t":"0"}},{"id":{"$t":"tag:blogger.com,1999:blog-3846425018755283358.post-85337532743026677"},"published":{"$t":"2015-05-05T05:00:00.000-05:00"},"updated":{"$t":"2015-05-13T13:56:17.923-05:00"},"category":[{"scheme":"http://www.blogger.com/atom/ns#","term":"Performance Tuning"}],"title":{"type":"text","$t":"Query Hint and Plan Guide"},"content":{"type":"html","$t":"\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003c/div\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003c/div\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca 
href\u003d\"http://4.bp.blogspot.com/-Lu3yoXoP5Fg/VUhHnuKacMI/AAAAAAAADi0/54aR8Hb5TXk/s1600/guide.jpg\" imageanchor\u003d\"1\" style\u003d\"clear: left; float: left; margin-bottom: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"133\" src\u003d\"http://4.bp.blogspot.com/-Lu3yoXoP5Fg/VUhHnuKacMI/AAAAAAAADi0/54aR8Hb5TXk/s200/guide.jpg\" width\u003d\"200\" /\u003e\u003c/a\u003e\u003c/div\u003e\nThe query execution plan generated for a SQL query significantly affects the query performance. Parameter sniffing, indexes, statistics, and other factors are taken into consideration for the SQL Server query optimizer to produce an optimal query plan for the execution in a timely manner. Sometimes, due to certain scenarios and limitations, the query plan generated may be suboptimal. There are multiple ways to address this problem, like modifying the way the query is written, using query hints, or even plan guides under certain circumstances. This blog post discusses an example of using a query hint as well as a plan guide.\n\n\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ca name\u003d'more'\u003e\u003c/a\u003eFirst, this is a contrived stored procedure to retrieve some sales and product information based on the user input of the product number. 
It has a LIKE operator in the WHERE condition to allow a specific product number or a wildcard value.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eCREATE PROCEDURE sp_ProductOrderDetail \n(\n  @ProductNumber nvarchar(25),\n  @RecodOffset int,\n  @RecordNumber int\n)\nAS\nBEGIN\nSELECT s.SalesOrderNumber, p.Name, d.OrderQty, d.LineTotal\nFROM sales.SalesOrderHeaderEnlarged s\nJOIN sales.SalesOrderDetailEnlarged d\n ON d.SalesOrderID \u003d s.SalesOrderID\nJOIN Production.Product p\n ON p.ProductID \u003d d.ProductID\nWHERE p.ProductNumber LIKE @ProductNumber\nORDER BY s.SalesOrderNumber \nOFFSET @RecodOffset ROWS\nFETCH NEXT @RecordNumber ROWS ONLY;\nEND\nGO\n\u003c/pre\u003e\n\u003cbr /\u003e\nTurn on the IO and Time statistics,\n\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eSET STATISTICS IO ON;\nSET STATISTICS TIME ON;\nGO\n\u003c/pre\u003e\n\u003cbr /\u003e\nThe performance comparison below is tested against a warm cache, where the data is already loaded in memory. \n\nExecute the first query to return the first 1000 records of all the frames (FR% as the ProductNumber). 
Before the execution, we clear the query plan cache (not the data in memory) for a better comparison later on.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eDBCC FREEPROCCACHE; --Remove the query plan cache (not the data)\nGO\n\nEXEC sp_ProductOrderDetail @ProductNumber \u003d 'FR%', @RecodOffset \u003d 0, @RecordNumber \u003d 1000;\nGO\n\u003c/pre\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://3.bp.blogspot.com/-3zdihDmcpW0/VUfdDykyZOI/AAAAAAAADgE/BTwY6w_9zwE/s1600/query1-stat.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"200\" src\u003d\"http://3.bp.blogspot.com/-3zdihDmcpW0/VUfdDykyZOI/AAAAAAAADgE/BTwY6w_9zwE/s640/query1-stat.png\" width\u003d\"500\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://4.bp.blogspot.com/-lx1yDfYOMg0/VUfdCHWED4I/AAAAAAAADfs/0J6XyoBI8Pw/s1600/query1-plan.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"112\" src\u003d\"http://4.bp.blogspot.com/-lx1yDfYOMg0/VUfdCHWED4I/AAAAAAAADfs/0J6XyoBI8Pw/s400/query1-plan.png\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nThe query executed in less than 1 second.\n\nThe query is written to also allow a specific product number (SO-B909-L). 
Let's remove the previous query plan cache (not the data in memory).\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eDBCC FREEPROCCACHE; --Remove the query plan cache (not the data)\nGO\n\nEXEC sp_ProductOrderDetail @ProductNumber \u003d 'SO-B909-L', @RecodOffset \u003d 0, @RecordNumber \u003d 1000;\nGO\n\u003c/pre\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-hG9AmA3kECs/VUfdFasyrOI/AAAAAAAADgg/_K1XCbbwUjE/s1600/query2-stat.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"200\" src\u003d\"http://1.bp.blogspot.com/-hG9AmA3kECs/VUfdFasyrOI/AAAAAAAADgg/_K1XCbbwUjE/s640/query2-stat.png\" width\u003d\"500\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://3.bp.blogspot.com/-j3wG8H5QOVk/VUfdE4x2kjI/AAAAAAAADgU/xiyNMQykRZU/s1600/query2-plan.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"200\" src\u003d\"http://3.bp.blogspot.com/-j3wG8H5QOVk/VUfdE4x2kjI/AAAAAAAADgU/xiyNMQykRZU/s640/query2-plan.png\" width\u003d\"500\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cb\u003e\u003cbr /\u003e\u003c/b\u003e\nNotice that the execution plan is different from the previous one? The SQL Server optimizer came up with this plan based on the sniffed parameter value (SO-B909-L). Instead of scanning the sales header and sales detail records and joining them with the product, it identifies the product record first and joins it with the respective sales details and sales header. 
This is likely because the set of records in scope is much smaller when an exact product number is provided.\n\u003cbr /\u003e\n\u003cbr /\u003e\nAs SQL Server reuses the previously cached query plan, executing again with the first parameter value 'FR%' yields this performance:\n\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eEXEC sp_ProductOrderDetail @ProductNumber \u003d 'FR%', @RecodOffset \u003d 0, @RecordNumber \u003d 1000;\nGO\n\u003c/pre\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://4.bp.blogspot.com/-Qb-Hcuv7p8k/VUfdEPqFWeI/AAAAAAAADgM/U36GKXG2R5o/s1600/query1-stat2.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"200\" src\u003d\"http://4.bp.blogspot.com/-Qb-Hcuv7p8k/VUfdEPqFWeI/AAAAAAAADgM/U36GKXG2R5o/s640/query1-stat2.png\" width\u003d\"500\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://2.bp.blogspot.com/-VuxkOPs3lTA/VUfdDlKtR_I/AAAAAAAADgY/ArfSfZQw1dk/s1600/query1-plan2.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"200\" src\u003d\"http://2.bp.blogspot.com/-VuxkOPs3lTA/VUfdDlKtR_I/AAAAAAAADgY/ArfSfZQw1dk/s640/query1-plan2.png\" width\u003d\"500\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nThe performance of the query went from less than 1 second to about 25 seconds! 
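\u003cbr /\u003e\n\u003cbr /\u003e\nOne way to confirm that the slow execution reused the plan compiled for the earlier value is to inspect the cached plan XML, where the ParameterCompiledValue attribute records the sniffed value. The query below is a minimal sketch: the DMVs used are standard, but the LIKE filter is just one convenient way to locate this particular statement.\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eSELECT st.text, qp.query_plan\nFROM sys.dm_exec_cached_plans cp\nCROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) st\nCROSS APPLY sys.dm_exec_query_plan(cp.plan_handle) qp\nWHERE st.text LIKE '%sp_ProductOrderDetail%';\n--Open query_plan and look for ParameterCompiledValue\n--under ParameterList to see the sniffed value.\n\u003c/pre\u003e\n\u003cbr /\u003e\n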
First, there is significant IO involved with this query plan for this parameter value, and multiple operations also spilled into tempdb, which makes the query run much slower.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003c/div\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://3.bp.blogspot.com/-H6ObHPW9IDM/VUfhcEeR4UI/AAAAAAAADg4/EkIiws_9JLk/s1600/query1-plan2-spill-operator2.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"492\" src\u003d\"http://3.bp.blogspot.com/-H6ObHPW9IDM/VUfhcEeR4UI/AAAAAAAADg4/EkIiws_9JLk/s640/query1-plan2-spill-operator2.png\" width\u003d\"500\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003c/div\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003c/div\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003c/div\u003e\n\u003cbr /\u003e\nThe query was designed to allow parameter values with very different scopes: a specific value that usually yields a small result set, and a wildcard pattern that usually returns a much larger result set.\u003cbr /\u003e\n\u003cbr /\u003e\nThere are multiple options to address this. Often, the stored procedure can be rewritten to allow different query plans to be used according to the parameter value. Since this post is about query hints and plan guides, let's explore those methods. \n\nOne such query hint is OPTIMIZE FOR UNKNOWN. The OPTIMIZE FOR UNKNOWN hint asks the SQL Server optimizer to use statistical data instead of the initial (sniffed) parameter values when the query is compiled and optimized. 
It produces a consistent query plan regardless of which initial value was sniffed.\u003cbr /\u003e\n\u003cbr /\u003e\nSince the provided parameter values yield different query plans as noticed previously, could the OPTIMIZE FOR UNKNOWN query hint help here?\n\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eALTER PROCEDURE sp_ProductOrderDetail \n(\n  @ProductNumber nvarchar(25),\n  @RecodOffset int,\n  @RecordNumber int\n)\nAS\nBEGIN\nSELECT s.SalesOrderNumber, p.Name, d.OrderQty, d.LineTotal\nFROM sales.SalesOrderHeaderEnlarged s\nJOIN sales.SalesOrderDetailEnlarged d\n ON d.SalesOrderID \u003d s.SalesOrderID\nJOIN Production.Product p\n ON p.ProductID \u003d d.ProductID\nWHERE p.ProductNumber LIKE @ProductNumber\nORDER BY s.SalesOrderNumber \nOFFSET @RecodOffset ROWS\nFETCH NEXT @RecordNumber ROWS ONLY\nOPTION (OPTIMIZE FOR UNKNOWN);\nEND\n\u003c/pre\u003e\n\u003cbr /\u003e\nNow, executing the stored procedure with either @ProductNumber \u003d 'SO-B909-L' or @ProductNumber \u003d 'FR%' produces the same query plan.\n\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eDBCC FREEPROCCACHE; \nGO\nEXEC sp_ProductOrderDetail @ProductNumber \u003d 'SO-B909-L', @RecodOffset \u003d 0, @RecordNumber \u003d 1000;\nGO\n\u003c/pre\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://4.bp.blogspot.com/--yekoG7jees/VUfiKSpcRxI/AAAAAAAADhA/_sbTKZvp0Lo/s1600/query1-plan3.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"112\" src\u003d\"http://4.bp.blogspot.com/--yekoG7jees/VUfiKSpcRxI/AAAAAAAADhA/_sbTKZvp0Lo/s400/query1-plan3.png\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eDBCC FREEPROCCACHE; \nGO\nEXEC sp_ProductOrderDetail @ProductNumber \u003d 'FR%', 
@RecodOffset \u003d 0, @RecordNumber \u003d 1000;\nGO\n\u003c/pre\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-x7GRkkDbNFc/VUfiK365TaI/AAAAAAAADhE/FAjKPiXl9ec/s1600/query2-plan3.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"109\" src\u003d\"http://1.bp.blogspot.com/-x7GRkkDbNFc/VUfiK365TaI/AAAAAAAADhE/FAjKPiXl9ec/s400/query2-plan3.png\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://4.bp.blogspot.com/-84-QXWUVyAE/VUfjn6D_Z4I/AAAAAAAADhU/BbtRIYHdsdg/s1600/query2-stat3.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"200\" src\u003d\"http://4.bp.blogspot.com/-84-QXWUVyAE/VUfjn6D_Z4I/AAAAAAAADhU/BbtRIYHdsdg/s640/query2-stat3.png\" width\u003d\"500\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nThe @ProductNumber \u003d 'FR%' performance looks good, but the @ProductNumber \u003d 'SO-B909-L' performance suffers with this plan. It went from about 1 second to around 20 seconds! One scenario becomes better at the expense of the other. 
Not good.\u003cbr /\u003e\n\u003cbr /\u003e\nAs we noticed previously, these two types of parameter value yield very different query plans for optimal performance. It seems best not to reuse the query plan cached from the initial parameter value, but rather to use an optimal plan for each provided parameter value.\u003cbr /\u003e\n\u003cbr /\u003e\nOPTION (RECOMPILE) is a query hint that discards the previously generated query plan and forces the query optimizer to compile a new plan the next time the same query is executed.\n\nLet's see how it works here.\n\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eALTER PROCEDURE sp_ProductOrderDetail \n(\n  @ProductNumber nvarchar(25),\n  @RecodOffset int,\n  @RecordNumber int\n)\nAS\nBEGIN\nSELECT s.SalesOrderNumber, p.Name, d.OrderQty, d.LineTotal\nFROM sales.SalesOrderHeaderEnlarged s\nJOIN sales.SalesOrderDetailEnlarged d\n ON d.SalesOrderID \u003d s.SalesOrderID\nJOIN Production.Product p\n ON p.ProductID \u003d d.ProductID\nWHERE p.ProductNumber LIKE @ProductNumber\nORDER BY s.SalesOrderNumber \nOFFSET @RecodOffset ROWS\nFETCH NEXT @RecordNumber ROWS ONLY\nOPTION (RECOMPILE);\nEND\n\u003c/pre\u003e\n\u003cbr /\u003e\nNow the SQL Server optimizer generates an optimal query plan for @ProductNumber \u003d 'FR%' or @ProductNumber \u003d 'SO-B909-L' respectively, regardless of the initial value used. Both parameter values perform as expected, with the optimal durations seen previously.\u0026nbsp;\n\u003cbr /\u003e\n\u003cbr /\u003e\nOne important note: as OPTION (RECOMPILE) recompiles and regenerates a new query plan on each execution, there is additional CPU overhead involved. 
Depending on how expensive the query plan is and how frequently the stored procedure is executed, RECOMPILE may or may not be a suitable solution. Also, since the cached plan is discarded after execution, it may not show up in SQL Server DMVs for monitoring or troubleshooting purposes.\u003cbr /\u003e\n\u003cbr /\u003e\nNow that we know OPTION (RECOMPILE) helps the query perform optimally for different parameter values, what happens if the query cannot be modified, for example because the stored procedure is provided by a vendor or the query is generated by an ORM?\u003cbr /\u003e\n\u003cbr /\u003e\nSQL Server plan guides can be useful in this type of scenario, where our hands are tied in addressing the query performance. Let's see how it works.\u003cbr /\u003e\n\u003cbr /\u003e\nFirst, remove the query hint from the stored procedure.\n\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eALTER PROCEDURE sp_ProductOrderDetail \n(\n  @ProductNumber nvarchar(25),\n  @RecodOffset int,\n  @RecordNumber int\n)\nAS\nBEGIN\nSELECT s.SalesOrderNumber, p.Name, d.OrderQty, d.LineTotal\nFROM sales.SalesOrderHeaderEnlarged s\nJOIN sales.SalesOrderDetailEnlarged d\n ON d.SalesOrderID \u003d s.SalesOrderID\nJOIN Production.Product p\n ON p.ProductID \u003d d.ProductID\nWHERE p.ProductNumber LIKE @ProductNumber\nORDER BY s.SalesOrderNumber \nOFFSET @RecodOffset ROWS\nFETCH NEXT @RecordNumber ROWS ONLY;\nEND\n\u003c/pre\u003e\n\u003cbr /\u003e\nUse the UI to create a plan guide for this query in the sp_ProductOrderDetail stored procedure.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-J8lHiVuHTjw/VUfkopyymyI/AAAAAAAADhc/H2CluahaKZc/s1600/planguide-general.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"320\" 
src\u003d\"http://1.bp.blogspot.com/-J8lHiVuHTjw/VUfkopyymyI/AAAAAAAADhc/H2CluahaKZc/s320/planguide-general.png\" width\u003d\"260\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://2.bp.blogspot.com/-PUF-jI_fslQ/VUfko9QEPvI/AAAAAAAADho/jS7ui7pT5SQ/s1600/planguide-sp.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"400\" src\u003d\"http://2.bp.blogspot.com/-PUF-jI_fslQ/VUfko9QEPvI/AAAAAAAADho/jS7ui7pT5SQ/s400/planguide-sp.png\" width\u003d\"374\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cb\u003e\u003cbr /\u003e\u003c/b\u003e\nOr use a SQL statement to create the plan guide.\n\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eUSE [AdventureWorks2014]\nGO\n\nEXEC sp_create_plan_guide @name \u003d N'[PlanGuide_sp_ProductOrderDetailPlanGuide]', \n@stmt \u003d N'SELECT s.SalesOrderNumber, p.Name, d.OrderQty, d.LineTotal\nFROM sales.SalesOrderHeaderEnlarged s\nJOIN sales.SalesOrderDetailEnlarged d\n ON d.SalesOrderID \u003d s.SalesOrderID\nJOIN Production.Product p\n ON p.ProductID \u003d d.ProductID\nWHERE p.ProductNumber LIKE @ProductNumber\nORDER BY s.SalesOrderNumber \nOFFSET @RecodOffset ROWS\nFETCH NEXT @RecordNumber ROWS ONLY;', \n@type \u003d N'OBJECT', \n@module_or_batch \u003d N'[dbo].[sp_ProductOrderDetail]', \n@hints \u003d N'OPTION (RECOMPILE)'\nGO\n\u003c/pre\u003e\n\u003cbr /\u003e\nExecute the stored procedure with different parameters and examine the query plan. 
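\u003cbr /\u003e\nIt can also be confirmed that the plan guide was created by querying sys.plan_guides. A minimal check, matching on the guide name used above:\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eSELECT name, scope_type_desc, is_disabled, hints\nFROM sys.plan_guides\nWHERE name LIKE N'%sp_ProductOrderDetail%';\n\u003c/pre\u003e\n\u003cbr /\u003e\n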
One way to verify whether the plan guide is used during execution is to right-click the SELECT node, click Properties, and examine the plan guide name value.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-mHhiecXBDQY/VUfkop6a7aI/AAAAAAAADhw/64ViDo_9WjA/s1600/sp-planguide-select.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"71\" src\u003d\"http://1.bp.blogspot.com/-mHhiecXBDQY/VUfkop6a7aI/AAAAAAAADhw/64ViDo_9WjA/s320/sp-planguide-select.png\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://3.bp.blogspot.com/-VnPHyiIqwKU/VUfkpS2I06I/AAAAAAAADh4/_6Ks-M_Qxfs/s1600/sp-planguide-setting.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"400\" src\u003d\"http://3.bp.blogspot.com/-VnPHyiIqwKU/VUfkpS2I06I/AAAAAAAADh4/_6Ks-M_Qxfs/s400/sp-planguide-setting.png\" width\u003d\"390\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nWhat if the SQL query is not a stored procedure but rather a query from an ORM application? 
Something like this:\n\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003esp_executesql \nN'SELECT s.SalesOrderNumber, p.Name, d.OrderQty, d.LineTotal\nFROM sales.SalesOrderHeaderEnlarged s\nJOIN sales.SalesOrderDetailEnlarged d\n ON d.SalesOrderID \u003d s.SalesOrderID\nJOIN Production.Product p\n ON p.ProductID \u003d d.ProductID\nWHERE p.ProductNumber LIKE @ProductNumber\nORDER BY s.SalesOrderNumber \nOFFSET @RecodOffset ROWS\nFETCH NEXT @RecordNumber ROWS ONLY;', \nN'@ProductNumber nvarchar(25), @RecodOffset int, @RecordNumber int', \n@ProductNumber \u003d 'SO-B909-L', @RecodOffset \u003d 0, @RecordNumber \u003d 1000;\n\u003c/pre\u003e\n\u003cbr /\u003e\nOr with sp_prepare:\n\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eDECLARE @p1 int\nEXEC sp_prepare @p1 output, N'@ProductNumber nvarchar(25), @RecodOffset int, @RecordNumber int', \nN'SELECT s.SalesOrderNumber, p.Name, d.OrderQty, d.LineTotal\nFROM sales.SalesOrderHeaderEnlarged s\nJOIN sales.SalesOrderDetailEnlarged d\n ON d.SalesOrderID \u003d s.SalesOrderID\nJOIN Production.Product p\n ON p.ProductID \u003d d.ProductID\nWHERE p.ProductNumber LIKE @ProductNumber\nORDER BY s.SalesOrderNumber \nOFFSET @RecodOffset ROWS\nFETCH NEXT @RecordNumber ROWS ONLY;';\n\nEXEC sp_execute @p1, N'SO-B909-L', 0, 1000;\n--EXEC sp_unprepare @p1;\n\u003c/pre\u003e\n\u003cbr /\u003e\nFrom the verification step earlier, the generated query plan is not using the query hint specified in the plan guide. A plan guide is only applied when very specific conditions are met; in this case, both the scope and the SQL statement must match. The plan guide created earlier is scoped only to the sp_ProductOrderDetail stored procedure. 
A proper plan guide needs to be created for this SQL query.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://4.bp.blogspot.com/-ADYASL8or9o/VUflHKTZAFI/AAAAAAAADh8/Jqu5dd1Y1AI/s1600/planguide-query.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"400\" src\u003d\"http://4.bp.blogspot.com/-ADYASL8or9o/VUflHKTZAFI/AAAAAAAADh8/Jqu5dd1Y1AI/s400/planguide-query.png\" width\u003d\"374\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eUSE [AdventureWorks2014]\nGO\n\nEXEC sp_create_plan_guide @name \u003d N'[PlanGuide_Query1]', \n@stmt \u003d N'SELECT s.SalesOrderNumber, p.Name, d.OrderQty, d.LineTotal\nFROM sales.SalesOrderHeaderEnlarged s\nJOIN sales.SalesOrderDetailEnlarged d\n ON d.SalesOrderID \u003d s.SalesOrderID\nJOIN Production.Product p\n ON p.ProductID \u003d d.ProductID\nWHERE p.ProductNumber LIKE @ProductNumber\nORDER BY s.SalesOrderNumber \nOFFSET @RecodOffset ROWS\nFETCH NEXT @RecordNumber ROWS ONLY;', \n@type \u003d N'SQL', \n@params \u003d N'@ProductNumber nvarchar(25), @RecodOffset int, @RecordNumber int', \n@hints \u003d N'OPTION (RECOMPILE)'\nGO\n\u003c/pre\u003e\n\u003cbr /\u003e\nNow the SQL query executes with the OPTION (RECOMPILE) hint as specified in the plan guide. 
The properties also show that the plan guide is being used.\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://4.bp.blogspot.com/-AB9A74mpTx4/VUflHMYydHI/AAAAAAAADiI/810-sjkWKqs/s1600/query-planguide-setting.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"400\" src\u003d\"http://4.bp.blogspot.com/-AB9A74mpTx4/VUflHMYydHI/AAAAAAAADiI/810-sjkWKqs/s400/query-planguide-setting.png\" width\u003d\"371\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\nExisting plan guides can be viewed from the UI or by querying the\u0026nbsp;\u003ca href\u003d\"https://msdn.microsoft.com/en-us/library/ms178010.aspx\" target\u003d\"_blank\"\u003esys.plan_guides\u003c/a\u003e catalog view.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://2.bp.blogspot.com/-6G_00YiedCs/VUfmAyU549I/AAAAAAAADiU/gxQIdv50-x8/s1600/planguide.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://2.bp.blogspot.com/-6G_00YiedCs/VUfmAyU549I/AAAAAAAADiU/gxQIdv50-x8/s1600/planguide.png\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nPlan guides are only available in Enterprise Edition. 
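\u003cbr /\u003e\nA plan guide also does not have to be dropped outright when it causes problems; sp_control_plan_guide can disable, re-enable, or drop it by name. A short sketch, using the guide name created above:\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003e--Disable the plan guide without dropping it\nEXEC sp_control_plan_guide @operation \u003d N'DISABLE', @name \u003d N'[PlanGuide_Query1]';\n\n--Re-enable it\nEXEC sp_control_plan_guide @operation \u003d N'ENABLE', @name \u003d N'[PlanGuide_Query1]';\n\n--Drop it when no longer needed\nEXEC sp_control_plan_guide @operation \u003d N'DROP', @name \u003d N'[PlanGuide_Query1]';\n\u003c/pre\u003e\n\u003cbr /\u003e\n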
Plan guides should always be used sparingly and only under very special circumstances.\u003cbr /\u003e\n\u003cbr /\u003e\nHopefully you find this post on query hints and plan guides helpful."},"link":[{"rel":"replies","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/85337532743026677/comments/default","title":"Post Comments"},{"rel":"replies","type":"text/html","href":"https://www.travisgan.com/2015/05/query-hint-and-plan-guide.html#comment-form","title":"0 Comments"},{"rel":"edit","type":"application/atom+xml","href":"https://draft.blogger.com/feeds/3846425018755283358/posts/default/85337532743026677"},{"rel":"self","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/posts/default/85337532743026677"},{"rel":"alternate","type":"text/html","href":"https://www.travisgan.com/2015/05/query-hint-and-plan-guide.html","title":"Query Hint and Plan Guide"}],"author":[{"name":{"$t":"Travis"},"email":{"$t":"noreply@blogger.com"},"gd$image":{"rel":"http://schemas.google.com/g/2005#thumbnail","width":"35","height":"35","src":"//www.blogger.com/img/blogger_logo_round_35.png"}}],"media$thumbnail":{"xmlns$media":"http://search.yahoo.com/mrss/","url":"http://4.bp.blogspot.com/-Lu3yoXoP5Fg/VUhHnuKacMI/AAAAAAAADi0/54aR8Hb5TXk/s72-c/guide.jpg","height":"72","width":"72"},"thr$total":{"$t":"0"}},{"id":{"$t":"tag:blogger.com,1999:blog-3846425018755283358.post-5048842932502111613"},"published":{"$t":"2015-03-30T05:00:00.000-05:00"},"updated":{"$t":"2015-03-30T11:33:32.060-05:00"},"category":[{"scheme":"http://www.blogger.com/atom/ns#","term":"Performance Tuning"}],"title":{"type":"text","$t":"Query Performance Tuning Example"},"content":{"type":"html","$t":"\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://3.bp.blogspot.com/-TfBFYp7hU40/VRWyyEtyNQI/AAAAAAAADdk/UZRLHa2sZls/s1600/tuning.jpg\" imageanchor\u003d\"1\" style\u003d\"clear: left; float: left; margin-bottom: 1em; 
margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://3.bp.blogspot.com/-TfBFYp7hU40/VRWyyEtyNQI/AAAAAAAADdk/UZRLHa2sZls/s1600/tuning.jpg\" height\u003d\"80\" width\u003d\"120\" /\u003e\u003c/a\u003e\u003c/div\u003e\nAs DBAs, application performance issues and complaints are often thrown our way. It is often fun to troubleshoot those performance issues, and when the problem is identified to lie in SQL Server, it gets even more exciting. Here is an example of how I tuned a query a while ago, along with some lessons learned.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ca name\u003d'more'\u003e\u003c/a\u003eI received a request to identify bottlenecks of an application whose users reported in a survey that they had been experiencing slow application response in their daily operations. Without any information on which specific module or functionality experienced the 'problem', I did some digging and noticed there were different types of queries and stored procedures developed in this application serving both operational and reporting purposes.\u003cbr /\u003e\n\u003cbr /\u003e\nUsing one of my own queries below, I looked up the most frequently run queries (order by COUNT) as well as the ones that take the longest time (order by TIME).\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eDECLARE @TopNumber smallint \u003d 20;\nDECLARE @OrderByField varchar(20) \u003d 'COUNT';\n\n--Top 10 CPU Usage Query\n--Declare @TopNumber smallint \u003d 10;\n--DECLARE @OrderByField varchar(20) \u003d 'CPU';\n\n-- CPU   - avg_cpu_time\n-- TIME   - avg_elapsed_time\n-- READ   - avg_physical_read + avg_logical_read\n-- WRITE  - avg_logical_write\n-- COUNT  - cache_ob_lookup\n-- RECOMPILE  - avg_ob_recompile\n-- PARALLEL  - avg_over_parallel\n\nWITH query_stat AS \n(SELECT TOP(@TopNumber)\n  SUM(qt.plan_generation_num) / SUM(cp.usecounts) AS avg_ob_recompile,\n  SUM(cp.usecounts) AS total_exec_count,\n  SUM(qt.total_worker_time) / 
SUM(qt.execution_count) / 1000 AS avg_cpu_time,\n  SUM(qt.total_elapsed_time) / SUM(qt.execution_count) / 1000 AS avg_elapsed_time,\n  SUM(qt.total_logical_writes) / SUM(qt.execution_count) AS avg_logical_write,\n  SUM(qt.total_logical_reads) / SUM(qt.execution_count) AS avg_logical_read,\n  SUM(qt.total_physical_reads) / SUM(qt.execution_count) AS avg_physical_read,\n  (SUM(qt.total_worker_time) - SUM(qt.total_elapsed_time)) / SUM(qt.execution_count) / 1000 AS avg_over_parallel,\n  MIN(qt.sql_handle) AS sql_handle,\n  qt.query_hash\nFROM sys.dm_exec_cached_plans cp WITH (NOLOCK)\nINNER JOIN sys.dm_exec_query_stats AS qt WITH (NOLOCK) ON\n cp.plan_handle \u003d qt.plan_handle\nCROSS APPLY sys.dm_exec_sql_text(qt.sql_handle) st \nGROUP BY qt.query_hash\nORDER BY \n (CASE @OrderByField \n  WHEN 'CPU' THEN SUM(qt.total_worker_time) / SUM(qt.execution_count)\n  WHEN 'READ' THEN SUM(qt.total_logical_reads + total_physical_reads) / SUM(qt.execution_count)\n  WHEN 'WRITE' THEN SUM(qt.total_logical_writes) / SUM(qt.execution_count)\n  WHEN 'COUNT' THEN SUM(cp.usecounts)\n  WHEN 'TIME' THEN SUM(qt.total_elapsed_time) / SUM(qt.execution_count)\n  WHEN 'RECOMPILE' THEN SUM(qt.plan_generation_num) / SUM(cp.usecounts)\n  WHEN 'PARALLEL' THEN (SUM(qt.total_worker_time) - SUM(qt.total_elapsed_time)) / SUM(qt.execution_count)\n END) DESC\n)\n\nSELECT TOP(@TopNumber)  \n DB_NAME(st.dbid) AS database_name,\n OBJECT_NAME(st.objectid, st.dbid) AS proc_name,\n cp.cacheobjtype,\n cp.objtype,\n qs.avg_ob_recompile,\n total_exec_count,\n avg_cpu_time,\n avg_elapsed_time,\n avg_logical_write,\n avg_logical_read,\n avg_physical_read,\n avg_over_parallel,\n SUBSTRING(st.text, (qt.statement_start_offset/2) + 1,\n   ((CASE statement_end_offset \n    WHEN -1 THEN DATALENGTH(st.text)\n    ELSE qt.statement_end_offset \n     END - qt.statement_start_offset)/2) + 1) AS sql_text, \n st.text AS ob_text,\n ph.query_plan\nFROM query_stat AS qs\nINNER JOIN sys.dm_exec_query_stats AS qt WITH (NOLOCK) 
ON\n qs.query_hash \u003d qt.query_hash AND\n qs.sql_handle \u003d qt.sql_handle\nINNER JOIN sys.dm_exec_cached_plans cp WITH (NOLOCK) ON\n cp.plan_handle \u003d qt.plan_handle\nCROSS APPLY sys.dm_exec_sql_text(qt.sql_handle) AS st\nCROSS APPLY sys.dm_exec_query_plan(qt.plan_handle) AS ph\nOPTION (RECOMPILE);\u003c/pre\u003e\n\u003cbr /\u003e\nBy analyzing the two different sets of data, I was able to find some correlation and identify some frequently called slow queries. I decided to focus on the 'operational' queries, as they are more aligned with the user complaints. With some help from the developers to filter out the reporting queries, I was able to identify one stored procedure that appeared to have a significant impact on users' operational tasks.\u003cbr /\u003e\n\u003cbr /\u003e\nHere is the target: dbo.SPBuildFastIndex\u003cbr /\u003e\n\u003cbr /\u003e\nThe stored procedure inserts a new entry into a table whenever there is a new entry; in addition, it can also be executed to delete all entries in the table and reinsert all the data. It is one of the most frequently run stored procedures, with a higher than average execution duration. From the cached system data, the stored procedure executed around 3000 times a day. 
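\u003cbr /\u003e\nExecution counts and average durations like these can also be pulled per procedure from the sys.dm_exec_procedure_stats DMV. A minimal sketch; note the counters are cumulative since the plan was cached, so cached_time is needed to derive a per-day rate:\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eSELECT \n OBJECT_NAME(ps.object_id, ps.database_id) AS proc_name,\n ps.cached_time,\n ps.execution_count,\n ps.total_elapsed_time / ps.execution_count / 1000 AS avg_elapsed_ms\nFROM sys.dm_exec_procedure_stats ps\nWHERE OBJECT_NAME(ps.object_id, ps.database_id) \u003d 'SPBuildFastIndex';\n\u003c/pre\u003e\n\u003cbr /\u003e\n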
From the statistics, it appears that certain SQL statements within the stored procedure take an average of 20 seconds to complete in production, and some executions take between 1 and 2 minutes to complete.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eALTER PROCEDURE dbo.SPBuildFastIndex (@ObjectID int)\nAS\n...\nINSERT INTO FastIndexTbl\n(LocID, FastIndexOTID, TransID, NameData, Info, Timestamp)\n(\n SELECT \n  t.LocID, \n  100, \n  t.TransID, \n  dbo.NameStr(b.FirstName, b.MiddleName, b.LastName, b.CompanyName),\n  CONVERT(varchar(50),d.DetailNumber) + ', ' + \n  ISNULL(dbo.NameStr(i.FirstName, i.MiddleName, i.LastName, NULL), 'No Data'), \n  ISNULL(CAST(o.OpsDate AS datetime), ISNULL(t.TransDate, '1/1/1900'))\n FROM Trans t \n INNER JOIN Detail d ON t.TransID\u003dd.TransID\n INNER JOIN Ind i ON d.DetailInd\u003di.IndID\n INNER JOIN Acct a ON t.AcctID\u003da.AcctID\n INNER JOIN Ind b ON b.IndID\u003da.CustID\n INNER JOIN [Ops] o ON d.DetailID\u003do.DetailID\n WHERE \n  d.DetailTypeID \u003d 100  \n  AND dbo.NameStr(b.FirstName, b.MiddleName, b.LastName, b.CompanyName) IS NOT NULL\n  AND t.TransID \u003d CASE WHEN @ObjectID \u003d 0 THEN t.TransID ELSE @ObjectID END\n);\n\u003c/pre\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\nI picked the slow SQL statement out for more isolated analysis and executed the SELECT statement with the @ObjectID parameter specified (e.g. 
100001),\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003e--Turn on statistics to display IO and time information of the execution\nSET STATISTICS IO ON;\nSET STATISTICS TIME ON;\n\u003c/pre\u003e\n\u003cbr /\u003e\nI executed the commands below to clear the cache each time I made a change, for a better comparison.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003e--Delete cache and data in memory for comparison of each change\nCHECKPOINT;\nDBCC DROPCLEANBUFFERS;\nDBCC FREEPROCCACHE;\n\u003c/pre\u003e\n\u003cbr /\u003e\nHere is the execution plan of the query (viewed in \u003ca href\u003d\"http://www.sqlsentry.com/products/plan-explorer/sql-server-query-view\" target\u003d\"_blank\"\u003eSQL Sentry Plan Explorer\u003c/a\u003e). Click on the picture for a larger view.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://4.bp.blogspot.com/-EBMbD6jpWQw/VRWZNO3DPWI/AAAAAAAADcc/fsrkcJTOYeY/s1600/Original_Performance.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://4.bp.blogspot.com/-EBMbD6jpWQw/VRWZNO3DPWI/AAAAAAAADcc/fsrkcJTOYeY/s1600/Original_Performance.png\" height\u003d\"200\" width\u003d\"500\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nHere are some of the IO and time statistics from a cold cache.\u003cbr /\u003e\n\u003cbr /\u003e\nTable 'Worktable'. Scan count 0, logical reads 0, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.\u003cbr /\u003e\nTable 'Acct'. Scan count 1, logical reads 22357, physical reads 20, read-ahead reads 22739, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.\u003cbr /\u003e\nTable 'Ops'. 
Scan count 1, logical reads 12541, physical reads 27, read-ahead reads 12555, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.\u003cbr /\u003e\nTable 'Detail'. Scan count 1, logical reads 21372, physical reads 24, read-ahead reads 21437, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.\u003cbr /\u003e\nTable 'Trans'. Scan count 1, logical reads 53846, physical reads 76, read-ahead reads 53872, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.\u003cbr /\u003e\nTable 'Ind'. Scan count 2, logical reads 282086, physical reads 205, read-ahead reads 141176, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.\u003cbr /\u003e\n\u003cbr /\u003e\n\u0026nbsp;SQL Server Execution Times:\u003cbr /\u003e\n\u0026nbsp; \u0026nbsp;CPU time \u003d 34203 ms, \u0026nbsp;elapsed time \u003d 43402 ms\u003cbr /\u003e\n\u003cbr /\u003e\nA better view from \u003ca href\u003d\"http://statisticsparser.com/\" target\u003d\"_blank\"\u003eStatistics Parser\u003c/a\u003e\u0026nbsp;by Richie Rump.\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://4.bp.blogspot.com/-eqs5ndv2UZU/VRW1JyFlBuI/AAAAAAAADdw/nrMzzeUA5fE/s1600/Original_Statistics.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://4.bp.blogspot.com/-eqs5ndv2UZU/VRW1JyFlBuI/AAAAAAAADdw/nrMzzeUA5fE/s1600/Original_Statistics.png\" height\u003d\"200\" width\u003d\"500\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nHere are the IO and time statistics from a warm cache (data available in memory)\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://4.bp.blogspot.com/-2f6eg_NSGFc/VRW17KSAZ8I/AAAAAAAADd4/WoqKavOvw1I/s1600/Original_Statistic_Warms.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; 
margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://4.bp.blogspot.com/-2f6eg_NSGFc/VRW17KSAZ8I/AAAAAAAADd4/WoqKavOvw1I/s1600/Original_Statistic_Warms.png\" height\u003d\"200\" width\u003d\"500\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nReading the data from memory reduces the duration by around 10 seconds.\u003cbr /\u003e\n\u003cbr /\u003e\nThe elapsed time with a warm cache likely reflects what the end user experiences. However, I will use the cold cache numbers for a better comparison across the improvement options.\u003cbr /\u003e\n\u003cbr /\u003e\nBelow are a few methods I tried in order to gauge the level of improvement.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003eIndex implementation\u003c/b\u003e\u003cbr /\u003e\nThe IO statistics, as well as the execution plan, show that the dbo.Ind table returns and processes a large number of pages, mainly because this table is wide and there is no useful narrower index to satisfy this query. The other index scan operators are likely chosen due to the number of rows that need to be returned for processing. Implementing some indexes could help reduce the number of pages required and returned.\u003cbr /\u003e\n\u003cbr /\u003e\nI examined the tables for redundant as well as reusable indexes. I found that\u0026nbsp;\u003ca href\u003d\"http://www.brentozar.com/blitzindex/\" target\u003d\"_blank\"\u003esp_BlitzIndex\u003c/a\u003e from Brent Ozar Unlimited makes the index examination easier and faster. I cleaned up some redundant indexes and decided to create a few new ones, e.g. 
on the dbo.Ind table.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eCREATE INDEX [IX_Ind_IndlID_Includes]\nON [dbo].[Ind] ([IndID]) INCLUDE ([FirstName], [MiddleName], [LastName], [CompanyName]);\n\u003c/pre\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://4.bp.blogspot.com/-tFnQbWWGocg/VRWZIbVRaEI/AAAAAAAADcM/9DZh943xpM0/s1600/Index_Implementation.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://4.bp.blogspot.com/-tFnQbWWGocg/VRWZIbVRaEI/AAAAAAAADcM/9DZh943xpM0/s1600/Index_Implementation.png\" height\u003d\"200\" width\u003d\"500\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://4.bp.blogspot.com/-Ho6dQnVbNqU/VRW2VLB50pI/AAAAAAAADeA/igY6VrnicZU/s1600/Statistics_Index.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://4.bp.blogspot.com/-Ho6dQnVbNqU/VRW2VLB50pI/AAAAAAAADeA/igY6VrnicZU/s1600/Statistics_Index.png\" height\u003d\"200\" width\u003d\"500\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\nThe duration drops by around 10 seconds with the index implementation from cold cache. The total pages read fall from ~392K to ~47K.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003eScalar User Defined Function and Inline Table Valued Function\u003c/b\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\nThe SQL statement uses a scalar user defined function (UDF), dbo.NameStr. 
The function is as innocent as concatenating the first, middle and last names, and sometimes the company name, into one string.\u003cbr /\u003e\n\u003cbr /\u003e\nThis function was presumably implemented to make code maintenance simpler and to avoid rewriting the same script every time the name needs to be concatenated into one string/column.\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eCREATE FUNCTION [dbo].[NameStr]\n(  \n  @FirstName varchar(50), @MiddleName varchar(50), @LastName varchar(50), @CompanyName varchar(100) \u003d ''\n)\nRETURNS varchar(150)\nAS\nBEGIN\n\nDECLARE @ret varchar(150)\n\nIF IsNull(@CompanyName,'') \u003d ''\n  SET @ret \u003d @LastName + CASE WHEN IsNull(@FirstName, '') \u0026lt;\u0026gt; '' THEN ', ' ELSE '' END + @FirstName + CASE WHEN @MiddleName is NULL THEN '' ELSE ' ' + @MiddleName END\nELSE\n  SET @ret \u003d @CompanyName\n\nRETURN @ret \n\nEND\u003c/pre\u003e\n\u003cbr /\u003e\nHowever, scalar functions have some disadvantages.\u003cbr /\u003e\n\u003cbr /\u003e\n1) When a scalar UDF is used in a query, SQL Server invokes the function as a separate module for each row within the iterator. 
This introduces overhead with every per-row invocation of the module\u003cbr /\u003e\n2) A scalar UDF's estimated cost is 0, which is not true, and this affects how the optimizer chooses its plan\u003cbr /\u003e\n3) A scalar UDF prevents parallelism in the plan\n\u003cbr /\u003e\n\u003cbr /\u003e\nA SQL Profiler trace shows that the scalar function is invoked and executed for each row.\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003c/div\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://4.bp.blogspot.com/-FnxYhw1A-Js/VRW6fFXTzWI/AAAAAAAADes/sZ1MioQJz3s/s1600/Profiler_ScalarFunction.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://4.bp.blogspot.com/-FnxYhw1A-Js/VRW6fFXTzWI/AAAAAAAADes/sZ1MioQJz3s/s1600/Profiler_ScalarFunction.png\" height\u003d\"200\" width\u003d\"500\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nThe plan shows the cost of the function as 0, which is not true.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://4.bp.blogspot.com/-cK1wr2yCUcg/VRWZGxiDBOI/AAAAAAAADcA/bl29VED21v4/s1600/Cost_ScalarFunction.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://4.bp.blogspot.com/-cK1wr2yCUcg/VRWZGxiDBOI/AAAAAAAADcA/bl29VED21v4/s1600/Cost_ScalarFunction.png\" height\u003d\"204\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nAll the execution plans so far have been serial, with no parallelism involved.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003c/div\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; 
text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-K-fJO6sEz-4/VRWlUJDqq3I/AAAAAAAADdU/brPpk5CUrks/s1600/Serial_ScalarFunction.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://1.bp.blogspot.com/-K-fJO6sEz-4/VRWlUJDqq3I/AAAAAAAADdU/brPpk5CUrks/s1600/Serial_ScalarFunction.png\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\nThere are a few ways to remedy this particular case:\u003cbr /\u003e\n\u003cbr /\u003e\n1) Rewrite the query to include the logic in the query itself, or\u003cbr /\u003e\n2) Replace it with an inline table valued function\u003cbr /\u003e\n\u003cbr /\u003e\nIf the decision is to use an inline table valued function (TVF), it can be created with syntax very similar to the existing scalar UDF, but instead of returning a scalar value, it returns a table with a single value (in this case). Please refer to the references for the exact syntax. Here is an example of an inline TVF for this purpose:\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eCREATE FUNCTION [dbo].[NameStrTable]\n( \n  @FirstName varchar(50), @MiddleName varchar(50), @LastName varchar(50), @CompanyName varchar(100) \u003d ''\n)\nRETURNS TABLE\nAS\nRETURN \n…\n);\n\u003c/pre\u003e\n\u003cbr /\u003e\nReplace every dbo.NameStr(FirstName, MiddleName, LastName, Company) with (SELECT Name FROM dbo.NameStrTable(FirstName, MiddleName, LastName, Company)):\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eSELECT \n t.LocID, \n 100, \n t.TransID, \n (SELECT Name FROM dbo.NameStrTable(b.FirstName, b.MiddleName, b.LastName, b.CompanyName)),\n CONVERT(varchar(50),d.DetailNumber) + ', ' + \n ISNULL((SELECT Name FROM dbo.NameStrTable(i.FirstName, i.MiddleName, i.LastName, NULL)), 'No Data'), \n ISNULL(CAST(o.OpsDate AS datetime), ISNULL(t.TransDate, '1/1/1900'))\nFROM Trans t \nINNER JOIN Detail d ON t.TransID\u003dd.TransID\nINNER 
JOIN Ind i ON d.DetailID\u003di.IndID\nINNER JOIN Acct a ON t.AcctID\u003da.AcctID\nINNER JOIN Ind b ON b.IndID\u003da.CustID\nINNER JOIN [Ops] o ON d.DetailID\u003do.DetailID\nWHERE \n d.DetailTypeID \u003d 100  \n AND (SELECT Name FROM dbo.NameStrTable(b.FirstName, b.MiddleName, b.LastName, b.CompanyName)) IS NOT NULL\n AND t.TransID \u003d CASE WHEN @ObjectID \u003d 0 THEN t.TransID ELSE @ObjectID END;\n\u003c/pre\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\nThe execution plan below show that SQL Server optimizer chose a parallel plan for this execution. In this case the degree of parallelism is 4.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://2.bp.blogspot.com/-ZTDxzdqozuY/VRWZQBkvohI/AAAAAAAADck/N2Z4s95wMvM/s1600/Parallel_TableFunction.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://2.bp.blogspot.com/-ZTDxzdqozuY/VRWZQBkvohI/AAAAAAAADck/N2Z4s95wMvM/s1600/Parallel_TableFunction.png\" height\u003d\"200\" width\u003d\"500\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://3.bp.blogspot.com/-HlgjFIXdMLg/VRW3hnWe1cI/AAAAAAAADeI/xHwq-0Fw9KU/s1600/Statistics_Function.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://3.bp.blogspot.com/-HlgjFIXdMLg/VRW3hnWe1cI/AAAAAAAADeI/xHwq-0Fw9KU/s1600/Statistics_Function.png\" height\u003d\"200\" width\u003d\"500\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\nThe duration reduces about 30 seconds from cold cache. 
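\u003cbr /\u003e\n\u003cbr /\u003e\nFor completeness, here is one possible body for such an inline TVF. This is my own sketch, not the original definition: it mirrors the scalar UDF's logic, except that the ', ' separator is concatenated inside the CASE so that a NULL @FirstName cannot NULL out the whole string.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003e--Sketch only: an assumed inline TVF equivalent of dbo.NameStr\nCREATE FUNCTION [dbo].[NameStrTable]\n( \n  @FirstName varchar(50), @MiddleName varchar(50), @LastName varchar(50), @CompanyName varchar(100) \u003d ''\n)\nRETURNS TABLE\nAS\nRETURN \n(\n  SELECT Name \u003d\n    CASE WHEN ISNULL(@CompanyName, '') \u003d ''\n      THEN @LastName\n           + CASE WHEN ISNULL(@FirstName, '') \u0026lt;\u0026gt; '' THEN ', ' + @FirstName ELSE '' END\n           + CASE WHEN @MiddleName IS NULL THEN '' ELSE ' ' + @MiddleName END\n      ELSE @CompanyName\n    END\n);\n\u003c/pre\u003e\n\u003cbr /\u003e\nBecause the function is inline, the optimizer expands this expression into the calling query instead of invoking a separate module per row.\u003cbr /\u003e\n\u003cbr /\u003e\n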
The execution was performed without any of the new indexes implemented earlier.\u003cbr /\u003e\n\u003cbr /\u003e\nIn SQL Profiler, it appears that SQL Server does not invoke the inline TVF for each row the way it does a scalar UDF, and hence avoids the overhead the scalar UDF was incurring.\u003cbr /\u003e\n\u003cdiv\u003e\n\u003cbr /\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://3.bp.blogspot.com/-ZhocOD88_60/VRWZUUTX0LI/AAAAAAAADdA/psD_i30KWjY/s1600/Profiler_TableFunction.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://3.bp.blogspot.com/-ZhocOD88_60/VRWZUUTX0LI/AAAAAAAADdA/psD_i30KWjY/s1600/Profiler_TableFunction.png\" height\u003d\"41\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\nThe combination of inlining the function into the query for better cost estimation, a parallel plan, and the elimination of the per-row invocation overhead allows the query to perform much faster.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003e\u003cstrike\u003eParameter Sniffing\u003c/strike\u003e CASE Condition and Parameter Embedding Optimization\u003c/b\u003e\u003cbr /\u003e\nI will get to the reason why I crossed out parameter sniffing in a minute.\u003cbr /\u003e\n\u003cbr /\u003e\nIf you remember, at the beginning of the review we discussed the purpose of this stored procedure. To build the fast index that speeds up search lookups, every new entry executes this stored procedure to insert a record into the search index; \u003cb\u003ein addition\u003c/b\u003e, this stored procedure can also be used to build a new index from scratch.\n\u003cbr /\u003e\n\u003cbr /\u003e\nThe key point is that this stored procedure serves two purposes. 
To do that, there is a CASE statement in the search condition (WHERE clause)\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003et.TransID \u003d CASE WHEN @ObjectID \u003d 0 THEN t.TransID ELSE @ObjectID END;\n\u003c/pre\u003e\n\u003cbr /\u003e\nWhen a zero is passed in as the ObjectID parameter, the query filters all rows in the sales table (along with the other joins and conditions); if a specific ObjectID is passed in, it filters only that specified sales item (again along with the other joins and search conditions).\u003cbr /\u003e\n\u003cbr /\u003e\nThere are about 1.5 million rows in the sales table. When the ObjectID parameter is zero, SQL Server needs to read all 1.5 million rows (if this is the only search condition) along with the relevant data from the other tables. SQL Server is likely to choose scan operators with merge or hash joins, as they are more efficient for the large number of rows to be processed. On the other hand, if a specific (non-zero) ObjectID value is supplied, SQL Server only needs to read that one row along with the data required from the other tables. Since SaleID is a clustered key, SQL Server is likely to choose a seek operator with nested loops joins, which are more efficient for a small row set.\u003cbr /\u003e\n\u003cbr /\u003e\nSince these two search conditions are combined in one CASE statement, SQL Server has to evaluate both and choose a plan that is merely good enough for either. In this case, the plan shown previously was chosen. 
Mostly scan operators along with merge and hash joins.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://3.bp.blogspot.com/-bkPGnt1SrwU/VRWZGaKwlkI/AAAAAAAADb8/3zZb3MEe5Ng/s1600/CASE_Predicate.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://3.bp.blogspot.com/-bkPGnt1SrwU/VRWZGaKwlkI/AAAAAAAADb8/3zZb3MEe5Ng/s1600/CASE_Predicate.png\" height\u003d\"141\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\nSo what is the performance difference of these two plans? Without any index implementation and function replacement, here is the statistics with current plan (statistics same with what shown previously)\u003cbr /\u003e\n\u003cbr /\u003e\nPlan SQL Server chose,\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://4.bp.blogspot.com/-EBMbD6jpWQw/VRWZNO3DPWI/AAAAAAAADcc/fsrkcJTOYeY/s1600/Original_Performance.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://4.bp.blogspot.com/-EBMbD6jpWQw/VRWZNO3DPWI/AAAAAAAADcc/fsrkcJTOYeY/s1600/Original_Performance.png\" height\u003d\"200\" width\u003d\"500\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://4.bp.blogspot.com/-eqs5ndv2UZU/VRW1JyFlBuI/AAAAAAAADdw/nrMzzeUA5fE/s1600/Original_Statistics.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://4.bp.blogspot.com/-eqs5ndv2UZU/VRW1JyFlBuI/AAAAAAAADdw/nrMzzeUA5fE/s1600/Original_Statistics.png\" height\u003d\"200\" width\u003d\"500\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr 
/\u003e\nWith the optimal plan,\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://4.bp.blogspot.com/-5zrxLhmt4fk/VRWZL1BIYtI/AAAAAAAADcU/Mzc93MdaHds/s1600/Optimal_Plan.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://4.bp.blogspot.com/-5zrxLhmt4fk/VRWZL1BIYtI/AAAAAAAADcU/Mzc93MdaHds/s1600/Optimal_Plan.png\" height\u003d\"200\" width\u003d\"500\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://2.bp.blogspot.com/-HRO4zRLun6w/VRW4CzCmudI/AAAAAAAADeQ/7Sjp0H363lg/s1600/Statistics_Condition.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://2.bp.blogspot.com/-HRO4zRLun6w/VRW4CzCmudI/AAAAAAAADeQ/7Sjp0H363lg/s1600/Statistics_Condition.png\" height\u003d\"200\" width\u003d\"500\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nWow. The performance difference between these two execution plans is huge. The optimal execution plan takes less than a second. Even without any new indexes, the CPU and IO demand is very low compared to the current plan, as it can reach that particular record through an index seek operation.\u003cbr /\u003e\n\u003cbr /\u003e\nNow that we know the current plan is not optimal for a single sales item due to the CASE statement, there are a few ways to remedy it. Before that, I want to discuss parameter sniffing. When a parameter is passed in the first time, SQL Server evaluates the parameter to choose and generate a compiled plan, executes with that plan and then caches it for reuse. Parameter sniffing occurs when the plan that was chosen and cached is not optimal for parameter values passed in subsequently. 
This usually occurs when the data is not evenly distributed in the table. It may seem as if a zero ObjectID was passed in the first time and caused SQL Server to generate a less optimal plan for subsequent executions with a specific ObjectID. However, that is not the case here, as SQL Server evaluates both conditions in the CASE statement to generate the plan, regardless of whether zero or a specific (non-zero) value is passed in first. In addition, since SaleID is a unique column, cardinality estimation is accurate and the same compiled plan should be optimal for any ObjectID value as long as it is not zero.\u003cbr /\u003e\n\u003cbr /\u003e\nNow, let’s explore different ways to remedy this.\u003cbr /\u003e\n\u003cbr /\u003e\n1) OPTION (RECOMPILE)\u003cbr /\u003e\n2) Rewrite as separate static queries with an IF statement\u003cbr /\u003e\n3) Rewrite as a dynamic query with an IF statement\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003eOPTION (RECOMPILE)\u003c/b\u003e\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eSELECT \n t.LocID, \n 100, \n t.TransID, \n dbo.NameStr(b.FirstName, b.MiddleName, b.LastName, b.CompanyName),\n CONVERT(varchar(50),d.DetailNumber) + ', ' + \n ISNULL(dbo.NameStr(i.FirstName, i.MiddleName, i.LastName, NULL), 'No Data'), \n ISNULL(CAST(o.OpsDate AS datetime), ISNULL(t.TransDate, '1/1/1900'))\nFROM Trans t \nINNER JOIN Detail d ON t.TransID\u003dd.TransID\nINNER JOIN Ind i ON d.DetailInd\u003di.IndID\nINNER JOIN Acct a ON t.AcctID\u003da.AcctID\nINNER JOIN Ind b ON b.IndID\u003da.CustID\nINNER JOIN [Ops] o ON d.DetailID\u003do.DetailID\nWHERE \n d.DetailTypeID \u003d 100  \n AND dbo.NameStr(b.FirstName, b.MiddleName, b.LastName, b.CompanyName) IS NOT NULL\n AND t.TransID \u003d CASE WHEN @ObjectID \u003d 0 THEN t.TransID ELSE @ObjectID END\nOPTION (RECOMPILE);\n\u003c/pre\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\nAdding OPTION (RECOMPILE) at the end of the statement forces SQL Server to regenerate a 
new plan based on the parameter value for every execution. In addition, it enables the parameter embedding optimization, which allows the parser to fully evaluate the CASE statement, hence generating an optimal plan both for the zero parameter and for a specific value. However, there is CPU overhead to recompile and generate a plan, and that may not be desirable, especially for a query that is executed frequently.\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-5HsOb4dARxo/VRWZQfIx_jI/AAAAAAAADco/LN7vXvv0TqA/s1600/Parameter_Embedding_Optimization.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://1.bp.blogspot.com/-5HsOb4dARxo/VRWZQfIx_jI/AAAAAAAADco/LN7vXvv0TqA/s1600/Parameter_Embedding_Optimization.png\" height\u003d\"100\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003eStatic query with IF statement\u003c/b\u003e\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eIF @ObjectID \u0026lt;\u0026gt; 0\nBEGIN\n SELECT \n  t.LocID, \n  100, \n  t.TransID, \n  dbo.NameStr(b.FirstName, b.MiddleName, b.LastName, b.CompanyName),\n  CONVERT(varchar(50),d.DetailNumber) + ', ' + \n  ISNULL(dbo.NameStr(i.FirstName, i.MiddleName, i.LastName, NULL), 'No Data'), \n  ISNULL(CAST(o.OpsDate AS datetime), ISNULL(t.TransDate, '1/1/1900'))\n FROM Trans t \n INNER JOIN Detail d ON t.TransID\u003dd.TransID\n INNER JOIN Ind i ON d.DetailInd\u003di.IndID\n INNER JOIN Acct a ON t.AcctID\u003da.AcctID\n INNER JOIN Ind b ON b.IndID\u003da.CustID\n INNER JOIN [Ops] o ON d.DetailID\u003do.DetailID\n WHERE \n  d.DetailTypeID \u003d 100  \n  AND dbo.NameStr(b.FirstName, b.MiddleName, b.LastName, b.CompanyName) IS NOT NULL\n  AND t.TransID \u003d @ObjectID;\nEND\nELSE\nBEGIN\n  SELECT \n  t.LocID, \n  100, \n  t.TransID, 
\n  dbo.NameStr(b.FirstName, b.MiddleName, b.LastName, b.CompanyName),\n  CONVERT(varchar(50),d.DetailNumber) + ', ' + \n  ISNULL(dbo.NameStr(i.FirstName, i.MiddleName, i.LastName, NULL), 'No Data'), \n  ISNULL(CAST(o.OpsDate AS datetime), ISNULL(t.TransDate, '1/1/1900'))\n FROM Trans t \n INNER JOIN Detail d ON t.TransID\u003dd.TransID\n INNER JOIN Ind i ON d.DetailInd\u003di.IndID\n INNER JOIN Acct a ON t.AcctID\u003da.AcctID\n INNER JOIN Ind b ON b.IndID\u003da.CustID\n INNER JOIN [Ops] o ON d.DetailID\u003do.DetailID\n WHERE \n  d.DetailTypeID \u003d 100  \n  AND dbo.NameStr(b.FirstName, b.MiddleName, b.LastName, b.CompanyName) IS NOT NULL\nEND\n\u003c/pre\u003e\n\u003cbr /\u003e\nThis method separates the two logics and allows SQL Server to generate, use and cache a different plan for each search condition. Once a compiled plan is cached, it can be reused on the next execution. The downside is that the same select and join statements have to be written twice.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003eDynamic query with IF statement\u003c/b\u003e\u003cbr /\u003e\n\u003cb\u003e\u003cbr /\u003e\u003c/b\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eDECLARE @sql nvarchar(max),\n  @parameter nvarchar(4000);\n\nSELECT @parameter \u003d N'@transid int';\n\nSELECT @sql \u003d \n N'SELECT \n  t.LocID, \n  100, \n  t.TransID, \n  dbo.NameStr(b.FirstName, b.MiddleName, b.LastName, b.CompanyName),\n  CONVERT(varchar(50),d.DetailNumber) + '', '' + \n  ISNULL(dbo.NameStr(i.FirstName, i.MiddleName, i.LastName, NULL), ''No Data''), \n  ISNULL(CAST(o.OpsDate AS datetime), ISNULL(t.TransDate, ''1/1/1900''))\n FROM Trans t \n INNER JOIN Detail d ON t.TransID\u003dd.TransID\n INNER JOIN Ind i ON d.DetailInd\u003di.IndID\n INNER JOIN Acct a ON t.AcctID\u003da.AcctID\n INNER JOIN Ind b ON b.IndID\u003da.CustID\n INNER JOIN [Ops] o ON d.DetailID\u003do.DetailID\n WHERE \n  d.DetailTypeID \u003d 100  \n  AND dbo.NameStr(b.FirstName, b.MiddleName, b.LastName, b.CompanyName) IS NOT NULL';\n\nIF @ObjectID \u0026lt;\u0026gt; 0 SELECT @sql \u003d @sql + N' AND t.TransID \u003d @transid;'\nELSE SELECT @sql \u003d @sql + N';';\n\nEXEC sp_executesql @sql, @parameter, @ObjectID;\n\u003c/pre\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\nThis method also separates the two search condition logics and allows SQL Server to generate, use and cache a different plan (via sp_executesql) for each search condition. Once a compiled plan is cached, the statement doesn't need to be recompiled on the next execution. The dynamic query allows the same select and join statement to be reused during coding. However, dynamic SQL adds another layer of complexity to coding and troubleshooting.\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003eObservation\u003c/b\u003e\u003cbr /\u003e\n\u003cb\u003eIndex Implementation\u003c/b\u003e\u003cbr /\u003e\nAs we have seen, some indexes do improve the performance of the current plan (the one optimized for the CASE condition) from a cold cache. They reduce the number of pages required, which translates to lower IO and, indirectly, lower memory demand. \n\u003cbr /\u003e\n\u003cbr /\u003e\nFor this particular stored procedure, it does not appear that these new indexes will significantly improve the query performance we are experiencing in production, especially once the data has been cached in memory. However, with an optimal query plan, some covering indexes could be helpful.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003eReplacing Scalar Function\u003c/b\u003e\u003cbr /\u003e\nThe inability to use a parallel plan and the significant per-row overhead incurred by the scalar UDF can severely affect performance when a large number of rows is involved, as in this case. 
Although a parallel plan is not always better than a serial plan, with a large number of rows it can be beneficial for multiple threads to split up the work. With a small row set, the scalar UDF may not have a significant impact, as we have seen with the optimal plan.\n\u003cbr /\u003e\n\u003cbr /\u003e\nFrom these observations, replacing the scalar UDF would improve performance in production, especially when a large number of rows is involved. The only catch is that the function is used in many different stored procedures. Since not all of them will replace it, changing this stored procedure to use a new function means there will be two separate functions to maintain. That may not be a big deal, but all developers will need to take note of it.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003eReplacing CASE statement\u003c/b\u003e\u003cbr /\u003e\nThe CASE statement in the search condition appears to be the main culprit behind this query's poor performance. The optimizer has chosen and generated a plan that is suboptimal for a specific (non-zero) parameter value, which is the majority of executions here.\n\u003cbr /\u003e\n\u003cbr /\u003e\nSince the situation is not caused by inaccurate cardinality estimation from uneven data distribution, recompilation on every execution may not be desirable. Only two search conditions are needed: one for a specific value and one for returning all rows. Rewriting the query with two separate branches allows an optimal cached plan for each type of execution, and also avoids the overhead of recompiling a plan on every execution.\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003eLesson Learned\u003c/b\u003e\u003cbr /\u003e\nWhen performing query tuning, the query itself should be examined closely before making any index changes. Indexes do help in many scenarios but should not be the first resort. 
Often, as in this scenario, the performance of a query can be significantly improved just by changing the way it is written. A better understanding of the various limitations and constraints (e.g. scalar functions) and of how the SQL Server optimizer works definitely helps.\u003cbr /\u003e\n\u003cbr /\u003e"},"link":[{"rel":"replies","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/5048842932502111613/comments/default","title":"Post Comments"},{"rel":"replies","type":"text/html","href":"https://www.travisgan.com/2015/03/query-performance-tuning-example.html#comment-form","title":"0 Comments"},{"rel":"edit","type":"application/atom+xml","href":"https://draft.blogger.com/feeds/3846425018755283358/posts/default/5048842932502111613"},{"rel":"self","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/posts/default/5048842932502111613"},{"rel":"alternate","type":"text/html","href":"https://www.travisgan.com/2015/03/query-performance-tuning-example.html","title":"Query Performance Tuning Example"}],"author":[{"name":{"$t":"Travis"},"email":{"$t":"noreply@blogger.com"},"gd$image":{"rel":"http://schemas.google.com/g/2005#thumbnail","width":"35","height":"35","src":"//www.blogger.com/img/blogger_logo_round_35.png"}}],"media$thumbnail":{"xmlns$media":"http://search.yahoo.com/mrss/","url":"http://3.bp.blogspot.com/-TfBFYp7hU40/VRWyyEtyNQI/AAAAAAAADdk/UZRLHa2sZls/s72-c/tuning.jpg","height":"72","width":"72"},"thr$total":{"$t":"0"}},{"id":{"$t":"tag:blogger.com,1999:blog-3846425018755283358.post-7965098821287716546"},"published":{"$t":"2015-02-14T16:12:00.003-06:00"},"updated":{"$t":"2015-03-17T15:48:49.368-05:00"},"category":[{"scheme":"http://www.blogger.com/atom/ns#","term":"Self Learning"}],"title":{"type":"text","$t":"One Important Aspect - Mentor"},"content":{"type":"html","$t":"Looking back on my education and career journey, there were times when it felt totally stagnant, and other times when it moved in leaps and bounds. 
I have come to realize the importance of a mentor.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ca name\u003d'more'\u003e\u003c/a\u003eI was fortunate to have some great teachers who helped me build a strong foundation in math, engineering and other logic-oriented subjects during my school years. I knew my stuff, and it wasn't hard to get good grades in those fields.\u003cbr /\u003e\n\u003cbr /\u003e\nWhen it came to my choice of educational advancement overseas, I had my first taste of lacking concrete direction. There was some information on the internet and material provided by the university; however, it all seemed the same to me at the time, and I wasn't sure what to look for. It wasn't long until I met the then head of the degree transfer program, whom I worked for part time. He shared some of his experience and led me to relevant resources based on my questions and what I was trying to achieve. The next thing I knew, I had picked up my bag and flown to the United States myself.\u003cbr /\u003e\n\u003cbr /\u003e\nFast forward to a few years ago, when I was a manufacturing engineer. I developed a strong interest in .net and database development during the company's transition to MES (Manufacturing Enterprise System). I was surrounded by engineers and occasionally some IT folks, but I couldn't seem to find a path that would take me to the next level in this new field. Eventually I took an opportunity to become a .Net developer at a small company, which also allowed me to work closely with databases.\n\u003cbr /\u003e\n\u003cbr /\u003e\nI learned as I worked, through online tutorials and documentation, books and blogs. 
Despite the success of replacing the company website and an internal application, both widely accepted, there wasn't really anyone I could get help from or work with to discuss different development approaches, performance improvement methods and other things that could reveal a much more efficient and effective way.\n\u003cbr /\u003e\n\u003cbr /\u003e\nThings started to change when I discovered user groups. .NET user group, C# user group, SQL user group: I attended them all as much as I could. There are so many folks dedicated to and passionate about the technology, and everyone is there to learn, help and share knowledge. Wow!\n\u003cbr /\u003e\n\u003cbr /\u003e\nAnother great discovery was the SQL community. I stumbled upon a few blogs and they blew my mind. Folks at BrentOzar Unlimited, awesome people at sqlskills and many, many more who actively and voluntarily share and help others.\u003cbr /\u003e\n\u003cbr /\u003e\nIn that awesome spirit of the SQL community, Paul Randal at SQLSkills, who was recently voted most desired mentor, is offering to mentor a few people. Personally, I feel he would be one of the ideal mentors for me. The reasons are our somewhat similar technical backgrounds; his past and current involvement as a Microsoft employee and manager, community contributor, consultant and business owner in the SQL Server field; and, last but not least, his principles on time management, with 50% of his time for traveling and the other half for his family. Although I know a few great resources where I can learn and improve my technical skills, I constantly have questions in other areas that keep resurfacing and call for more decisive direction. Questions like generalist vs specialist, consultant vs employee and others. 
These may seem like common questions that have been widely addressed, but I am looking for something more specific to my own situation, and for insight drawn from the personal experience of someone in the same field.\u003cbr /\u003e\n\u003cbr /\u003e\nWhoever he picks as mentees will surely be lucky ones!\u003cbr /\u003e\n\u003cbr /\u003e\nAlthough these folks were not my mentors, I have learned so much from them through their blogs, videos and interactions. They not only improved my technical skill set; they changed the way I view work and career advancement. Work is so much fun when everyone shares their knowledge and makes everyone better. \u003cbr /\u003e\n\u003cbr /\u003e\nOne last note from personal experience: regardless of whether a person is in school, the workforce, or business, having the right mentor plays an important role and has a great impact on the mentee. By sharing his/her personal experience and valuable insight, along with guidance and resources, a mentor often broadens the vision and expands the horizons of the mentee in ways he/she would never have thought of. On the other side, it is important for the mentee to be willing to stay open to other opinions and to have some idea of his/her goals and what he/she is trying to achieve in order for the mentoring process to be effective.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003eUpdate\u003c/b\u003e: \u003ca href\u003d\"http://www.sqlskills.com/blogs/paul/mentoring-class-2015/\" target\u003d\"_blank\"\u003eClass of 2015\u003c/a\u003e! I can't imagine a person (a busy consulting business owner) volunteering to mentor the whole class! 
But here I present to you: Paul Randal!\u003cbr /\u003e\n\u003cbr /\u003e"},"link":[{"rel":"replies","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/7965098821287716546/comments/default","title":"Post Comments"},{"rel":"replies","type":"text/html","href":"https://www.travisgan.com/2015/02/one-important-aspect-mentor.html#comment-form","title":"0 Comments"},{"rel":"edit","type":"application/atom+xml","href":"https://draft.blogger.com/feeds/3846425018755283358/posts/default/7965098821287716546"},{"rel":"self","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/posts/default/7965098821287716546"},{"rel":"alternate","type":"text/html","href":"https://www.travisgan.com/2015/02/one-important-aspect-mentor.html","title":"One Important Aspect - Mentor"}],"author":[{"name":{"$t":"Travis"},"email":{"$t":"noreply@blogger.com"},"gd$image":{"rel":"http://schemas.google.com/g/2005#thumbnail","width":"35","height":"35","src":"//www.blogger.com/img/blogger_logo_round_35.png"}}],"thr$total":{"$t":"0"}},{"id":{"$t":"tag:blogger.com,1999:blog-3846425018755283358.post-1527530178142927237"},"published":{"$t":"2015-02-02T06:00:00.000-06:00"},"updated":{"$t":"2015-03-12T15:58:45.735-05:00"},"category":[{"scheme":"http://www.blogger.com/atom/ns#","term":"PowerShell"},{"scheme":"http://www.blogger.com/atom/ns#","term":"SSIS"}],"title":{"type":"text","$t":"Execute SSIS remotely - PowerShell"},"content":{"type":"html","$t":"\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-YGED4-b83O8/VM8IOdHotFI/AAAAAAAADZ8/YFYGITT0KCE/s1600/remotessis.png\" imageanchor\u003d\"1\" style\u003d\"clear: left; float: left; margin-bottom: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://1.bp.blogspot.com/-YGED4-b83O8/VM8IOdHotFI/AAAAAAAADZ8/YFYGITT0KCE/s1600/remotessis.png\" /\u003e\u003c/a\u003e\u003c/div\u003e\nSince SQL Server 2012, Microsoft introduced a new 
way of interacting with and storing SSIS packages. SSIS packages are frequently executed on a schedule, often through SQL Server Agent jobs. Today's blog post focuses on\u0026nbsp;remotely calling SSIS packages stored in this new\u0026nbsp;SSISDB Catalog\u003cb\u003e.\u003c/b\u003e\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ca name\u003d'more'\u003e\u003c/a\u003eThe SSISDB Catalog is the new central point for working with these objects, including SSIS projects, packages, and their parameters and environments. These objects are stored in the SSISDB database, which is automatically created when the Integration Services Catalog is created. Interaction with objects stored in the SSISDB Catalog is performed mainly through the SSIS Catalog UI or by calling the stored procedures provided in SSISDB.\u003cbr /\u003e\n\u003cbr /\u003e\nThere are a few methods to remotely call the SSIS packages in the SSISDB Catalog.\u0026nbsp;One method is through the\u0026nbsp;\u003ca href\u003d\"https://msdn.microsoft.com/en-us/library/microsoft.sqlserver.management.integrationservices.aspx\" target\u003d\"_blank\"\u003eMicrosoft.SqlServer.Management.IntegrationServices\u003c/a\u003e namespace; another is to utilize the SSISDB stored procedures. Please be aware that executing SSIS packages stored in\u0026nbsp;msdb, the SSIS package store or the file system is done differently.\u003cbr /\u003e\n\u003cbr /\u003e\nThe PowerShell example below utilizes the SSISDB stored procedures. 
It first executes the SSISDB catalog.create_execution stored procedure with all the parameter values to create an instance of execution, and subsequently executes the catalog.start_execution stored procedure to start the particular execution instance just created.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:powershell\"\u003efunction Invoke-SSISPackage {\n    [CmdletBinding(DefaultParametersetName\u003d\"SSPI\")] \n    param(\n        [parameter(Mandatory\u003d$true, Position\u003d0)]\n        [string]$SQLInstance,\n\n        [parameter(ParameterSetName\u003d'User', Mandatory\u003d$true, Position\u003d1)]\n        [string]$User,\n\n        [parameter(ParameterSetName\u003d'User', Mandatory\u003d$true, Position\u003d2)]\n        [string]$password,\n\n        [parameter(ParameterSetName\u003d'SSPI', Mandatory\u003d$true, Position\u003d3)]\n        [switch]$SSPI,\n\n        [parameter(Mandatory\u003d$true, Position\u003d4)]\n        [string]$Folder,\n\n        [parameter(Mandatory\u003d$true, Position\u003d5)]\n        [string]$Project,\n\n        [parameter(Mandatory\u003d$true, Position\u003d6)]\n        [ValidatePattern('^.*\\.dtsx$')]\n        [string]$Package,\n\n        [parameter(Position\u003d7)]\n        [switch]$RunIn32Bit\n    )\n\n    $sqlConn \u003d New-Object System.Data.SqlClient.SqlConnection\n    $sqlConn.ConnectionString \u003d \"Server\u003d$($SQLInstance);`\n                                 Database\u003dSSISDB;`\n                                 User\u003d$($User);`\n                                 Password\u003d$($Password);`\n                                 Integrated Security\u003d$( @{$true\u003d\"SSPI\"; $false\u003d\"False\"}[$SSPI -eq $true] )\" \n    try {\n        $sqlConn.Open()\n\n        Write-Host \"Creating SSIS execution..\" -ForegroundColor \"Yellow\"\n\n        $sqlCmd \u003d New-Object System.Data.SqlClient.SqlCommand (\"[catalog].[create_execution]\", $sqlConn)\n        $sqlCmd.CommandType \u003d 
[System.Data.CommandType]::StoredProcedure\n        $sqlCmd.Parameters.AddWithValue(\"folder_name\", $Folder) | Out-Null\n        $sqlCmd.Parameters.AddWithValue(\"project_name\", $Project) | Out-Null\n        $sqlCmd.Parameters.AddWithValue(\"package_name\", $Package) | Out-Null\n        $sqlCmd.Parameters.Add(\"use32bitruntime\", [System.Data.SqlDbType]::Bit).Value \u003d $RunIn32Bit.IsPresent\n        $sqlCmd.Parameters.Add(\"execution_id\", [System.Data.SqlDbType]::BigInt).Direction \u003d [System.Data.ParameterDirection]::Output\n        $sqlCmd.ExecuteNonQuery() | Out-Null\n        \n        [int64]$execID \u003d $sqlCmd.Parameters[\"execution_id\"].Value \n        $sqlCmd.Dispose()\n\n        Write-Host \"\"\n        Write-Host \"Starting SSIS execution..\" -ForegroundColor \"Yellow\"\n\n        $sqlCmd \u003d New-Object System.Data.SqlClient.SqlCommand (\"[catalog].[start_execution]\", $sqlConn)\n        $sqlCmd.CommandType \u003d [System.Data.CommandType]::StoredProcedure\n        $sqlCmd.Parameters.AddWithValue(\"execution_id\", $execID) | Out-Null\n        $sqlCmd.ExecuteNonQuery() | Out-Null\n        \n        $sqlCmd.Dispose()\n    }\n    catch {\n        throw\n    }\n    finally {\n        $sqlConn.Dispose()\n    }\n}\n\u003c/pre\u003e\n\u003cbr /\u003e\nAlthough the example above uses PowerShell, implementing it in other programming languages such as C# or VB.NET is straightforward with only minor changes. 
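For reference, the same two stored procedure calls can also be issued directly in T-SQL. This is a minimal sketch; the folder, project and package names below are placeholders, to be replaced with your own values.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eDECLARE @execution_id bigint;\n\n-- Create an instance of execution for the package\nEXEC [SSISDB].[catalog].[create_execution]\n    @folder_name \u003d N'MyFolder',\n    @project_name \u003d N'MyProject',\n    @package_name \u003d N'MyPackage.dtsx',\n    @use32bitruntime \u003d 0,\n    @execution_id \u003d @execution_id OUTPUT;\n\n-- Start the execution instance just created\nEXEC [SSISDB].[catalog].[start_execution] @execution_id \u003d @execution_id;\n\u003c/pre\u003e\n\u003cbr /\u003e\nThis is exactly what the PowerShell function wraps, so it can be handy for quick testing from SSMS.\u003cbr /\u003e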
Please note that the script returns a result when the execution is started successfully, not when the SSIS package completes its execution successfully.\u003cbr /\u003e\n\u003cbr /\u003e\nHope you find the example script useful."},"link":[{"rel":"replies","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/1527530178142927237/comments/default","title":"Post Comments"},{"rel":"replies","type":"text/html","href":"https://www.travisgan.com/2015/02/execute-ssis-remotely-powershell.html#comment-form","title":"0 Comments"},{"rel":"edit","type":"application/atom+xml","href":"https://draft.blogger.com/feeds/3846425018755283358/posts/default/1527530178142927237"},{"rel":"self","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/posts/default/1527530178142927237"},{"rel":"alternate","type":"text/html","href":"https://www.travisgan.com/2015/02/execute-ssis-remotely-powershell.html","title":"Execute SSIS remotely - PowerShell"}],"author":[{"name":{"$t":"Travis"},"email":{"$t":"noreply@blogger.com"},"gd$image":{"rel":"http://schemas.google.com/g/2005#thumbnail","width":"35","height":"35","src":"//www.blogger.com/img/blogger_logo_round_35.png"}}],"media$thumbnail":{"xmlns$media":"http://search.yahoo.com/mrss/","url":"http://1.bp.blogspot.com/-YGED4-b83O8/VM8IOdHotFI/AAAAAAAADZ8/YFYGITT0KCE/s72-c/remotessis.png","height":"72","width":"72"},"thr$total":{"$t":"0"}},{"id":{"$t":"tag:blogger.com,1999:blog-3846425018755283358.post-1805727327102455505"},"published":{"$t":"2015-01-07T08:00:00.000-06:00"},"updated":{"$t":"2015-01-07T12:16:32.250-06:00"},"category":[{"scheme":"http://www.blogger.com/atom/ns#","term":"Backup/Recovery"},{"scheme":"http://www.blogger.com/atom/ns#","term":"Troubleshooting"}],"title":{"type":"text","$t":"SQL Server Database File - Date Modified"},"content":{"type":"html","$t":"\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca 
href\u003d\"http://3.bp.blogspot.com/-Dwj8vpK4iEg/VK1so6LATcI/AAAAAAAADZA/5ybpghH-eU8/s1600/datemodified.png\" imageanchor\u003d\"1\" style\u003d\"clear: left; float: left; margin-bottom: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://3.bp.blogspot.com/-Dwj8vpK4iEg/VK1so6LATcI/AAAAAAAADZA/5ybpghH-eU8/s1600/datemodified.png\" height\u003d\"55\" width\u003d\"200\" /\u003e\u003c/a\u003e\u003c/div\u003e\nRecently, there was a discussion if the date modified of the database files shown in Windows explorer could be used to determine when the database was last used (or recently used). Often, this date is used to determine when a file (eg. word, excel) is last updated. Could that also be applied to SQL Serve database?\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ca name\u003d'more'\u003e\u003c/a\u003e\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003c/div\u003e\nLet's do a few tests to see how the date correlate with the database change.\u003cbr /\u003e\n\u003cbr /\u003e\nCreate a database called Testing,\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eCREATE DATABASE Testing\nON PRIMARY\n(\n NAME \u003d 'Testing',\n FILENAME \u003d 'D:\\MSSQL12.MSSQL2014\\MSSQL\\DATA\\Testing.mdf',\n SIZE \u003d 4 MB,\n FILEGROWTH \u003d 8 KB\n) \nLOG ON\n(\n NAME \u003d 'Testing_Log',\n FILENAME \u003d 'D:\\MSSQL12.MSSQL2014\\MSSQL\\DATA\\Testing_log.ldf',\n SIZE \u003d 1 MB\n);\n\u003c/pre\u003e\n\u003cbr /\u003e\nHere is what it look slike with PowerShell Get-ChildItem cmdlet on the database file folder. LastWriteTime as Date Modified as seen in the explorer. 
Length is the size (in bytes) of the file.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003c/div\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-fXvrStLlh4U/VKzIC6q7ZnI/AAAAAAAADWE/8sSkAG5zlIA/s1600/CreateTableAndInsertData.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://1.bp.blogspot.com/-fXvrStLlh4U/VKzIC6q7ZnI/AAAAAAAADWE/8sSkAG5zlIA/s1600/CreateTableAndInsertData.png\" height\u003d\"52\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nDuring the database creation, the size of the database files was specified (eg. 4 MB for the MDF data file). However, that doesn't mean the file has been entirely allocated (or is entirely in use). The script below shows the physical size of the database file as well as how much of it is actually being used,\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eUSE testing;\n\nSELECT \n CURRENT_TIMESTAMP AS [time],\n f.name [file_name], \n f.size * 8192 [file_size_B], \n FILEPROPERTY(f.name, 'SpaceUsed') * 8192 [Used_size_B]\nFROM sys.database_files f;\n\u003c/pre\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://2.bp.blogspot.com/-F-qNIAquxhw/VKzSlpy2ftI/AAAAAAAADYE/X4mNIrhDim8/s1600/InitialUsedSize.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://2.bp.blogspot.com/-F-qNIAquxhw/VKzSlpy2ftI/AAAAAAAADYE/X4mNIrhDim8/s1600/InitialUsedSize.png\" height\u003d\"51\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nNotice that the used size of the data file (MDF) is around 2.49 MB out of the 4 MB data file. 
The same applies to the log file.\u003cbr /\u003e\n\u003cbr /\u003e\nNow we are going to perform a few tasks and see what affects the modified date of the database files.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003eCreate table (Data Definition Language, DDL)\u003c/b\u003e\u003cbr /\u003e\nThe script below creates a table in the database,\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eUSE Testing;\n\nCREATE TABLE dbo.tbl1\n(\n ID int IDENTITY(1,1),\n Col1 nchar(4000) DEFAULT (REPLICATE('G', 4000)),\n Col2 char(49) DEFAULT (REPLICATE('A', 49))\n);\u003c/pre\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://2.bp.blogspot.com/-vEuCZtSeiiY/VKzIC8oWvUI/AAAAAAAADVg/5U-2zJqX7fw/s1600/CreateTable.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://2.bp.blogspot.com/-vEuCZtSeiiY/VKzIC8oWvUI/AAAAAAAADVg/5U-2zJqX7fw/s1600/CreateTable.png\" height\u003d\"49\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nNotice that the used size of the log file has increased slightly. 
Let's examine the Date Modified of the database files.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-fXvrStLlh4U/VKzIC6q7ZnI/AAAAAAAADWE/8sSkAG5zlIA/s1600/CreateTableAndInsertData.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://1.bp.blogspot.com/-fXvrStLlh4U/VKzIC6q7ZnI/AAAAAAAADWE/8sSkAG5zlIA/s1600/CreateTableAndInsertData.png\" height\u003d\"52\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nIt still shows the time when the database was first created, for both the data and the log file.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003eInsert Data (Data Manipulation Language, DML)\u003c/b\u003e\u003cbr /\u003e\nSince the database is created in the Simple recovery model for this demo (a database in the Full recovery model without any backup also behaves like the Simple recovery model), the log file is automatically truncated and reused without growing if the transaction size is smaller than the file size. 
Because of this, we will only focus on the data file from now on.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eINSERT INTO dbo.tbl1\nDEFAULT VALUES;\nGO 2\n\u003c/pre\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://2.bp.blogspot.com/-AzphC-R7YQ0/VKzIEVGlezI/AAAAAAAADWI/6Aq2nn3lrPo/s1600/InsertData.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://2.bp.blogspot.com/-AzphC-R7YQ0/VKzIEVGlezI/AAAAAAAADWI/6Aq2nn3lrPo/s1600/InsertData.png\" height\u003d\"50\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-fXvrStLlh4U/VKzIC6q7ZnI/AAAAAAAADWE/8sSkAG5zlIA/s1600/CreateTableAndInsertData.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://1.bp.blogspot.com/-fXvrStLlh4U/VKzIC6q7ZnI/AAAAAAAADWE/8sSkAG5zlIA/s1600/CreateTableAndInsertData.png\" height\u003d\"52\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nThe data file used size has increased to around 2.56 MB. 
However, the Date Modified of the data file is still shown as the time when the database was first created.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003eInsert data large enough to require file growth\u003c/b\u003e\u003cbr /\u003e\nNow we are going to insert enough data to require the data file to grow.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eINSERT INTO dbo.tbl1\nDEFAULT VALUES;\nGO 207\n\u003c/pre\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://4.bp.blogspot.com/-Tfll7dZxq4g/VKzIx5HBFuI/AAAAAAAADXs/HalIJPNK5Ko/s1600/FileGrowSize.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://4.bp.blogspot.com/-Tfll7dZxq4g/VKzIx5HBFuI/AAAAAAAADXs/HalIJPNK5Ko/s1600/FileGrowSize.png\" height\u003d\"49\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://2.bp.blogspot.com/-I-a0gboZnjk/VKzIDGMu1hI/AAAAAAAADVw/PcyAjlZijRI/s1600/FileGrowModifiedDate.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://2.bp.blogspot.com/-I-a0gboZnjk/VKzIDGMu1hI/AAAAAAAADVw/PcyAjlZijRI/s1600/FileGrowModifiedDate.png\" height\u003d\"49\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nThe data file has grown to around 4.25 MB (64 KB more than the original size), and the data file modified date is updated.\u003cbr /\u003e\n\u003cbr /\u003e\nThe database is configured to grow (i.e. auto grow) by 8 KB. Some of you with an eagle eye may have noticed that 8 KB was specified as the file growth increment during the database creation. 
However, SQL Server grows by a minimum of one extent (8 data pages of 8 KB each \u003d 64 KB).\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003eModify database file size\u003c/b\u003e\u003cbr /\u003e\nThe script below changes the data file size to 10 MB.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eALTER DATABASE Testing\nMODIFY FILE (\n NAME \u003d 'Testing',\n SIZE \u003d 10MB );\n\u003c/pre\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-RkxFC96UlDQ/VKzICNem1DI/AAAAAAAADV4/DMGf24QAk7U/s1600/AlterSize.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://1.bp.blogspot.com/-RkxFC96UlDQ/VKzICNem1DI/AAAAAAAADV4/DMGf24QAk7U/s1600/AlterSize.png\" height\u003d\"47\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-OUGhUvOdAhA/VKzV9AkPZRI/AAAAAAAADYU/qBjLlwh83Mk/s1600/AlterSizeModifiedDate.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://1.bp.blogspot.com/-OUGhUvOdAhA/VKzV9AkPZRI/AAAAAAAADYU/qBjLlwh83Mk/s1600/AlterSizeModifiedDate.png\" height\u003d\"56\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nThe modified date of the data file has been updated.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003eShrink database file\u003c/b\u003e\u003cbr /\u003e\nBelow is the script to shrink the data file to 6 MB. 
Some data was added before the shrink task just to show a different used size.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eINSERT INTO dbo.tbl1\nDEFAULT VALUES;\nGO 10\n\nDBCC SHRINKFILE ('Testing', 6);\n\u003c/pre\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://2.bp.blogspot.com/-vYGAGNhPawU/VKzIFSa1P_I/AAAAAAAADXI/sZLW7VPEaRA/s1600/ShrinkSize.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://2.bp.blogspot.com/-vYGAGNhPawU/VKzIFSa1P_I/AAAAAAAADXI/sZLW7VPEaRA/s1600/ShrinkSize.png\" height\u003d\"52\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://3.bp.blogspot.com/-IQXEOAab1Aw/VKzIFk36QHI/AAAAAAAADWs/dEpWlu1RnSY/s1600/ShrinkSizeModifiedDate.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://3.bp.blogspot.com/-IQXEOAab1Aw/VKzIFk36QHI/AAAAAAAADWs/dEpWlu1RnSY/s1600/ShrinkSizeModifiedDate.png\" height\u003d\"50\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nThe data file modified date reflects the time the shrink action was performed.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003eDatabase Backup\u003c/b\u003e\u003cbr /\u003e\nThe script below performs a full backup. Some data is also added beforehand, just for the sake of it.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eINSERT INTO dbo.tbl1\nDEFAULT VALUES;\nGO 10\n\nBACKUP DATABASE Testing\nTO DISK \u003d 'Testing.bak';\u003c/pre\u003e\n\u003cbr /\u003e\nNothing changes in the data file size or its used size. 
The data file Date Modified remains the same as before.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003eAuto Close\u003c/b\u003e\u003cbr /\u003e\nBy default, the auto_close database setting is set to false, and as a best practice it should remain false. Turning on this setting makes SQL Server close the database after the last connection to the database disconnects. This is not recommended in practice, as it takes time to re-obtain resources when the database is reopened.\u003cbr /\u003e\n\u003cbr /\u003e\nThe script below sets the Auto_Close setting to true.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eALTER DATABASE Testing\nSET AUTO_CLOSE ON;\n\u003c/pre\u003e\n\u003cbr /\u003e\nIn another window, using \u003ccode\u003esp_who2\u003c/code\u003e, we identify and kill all the connections to the Testing database,\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://3.bp.blogspot.com/-ycphFJlPvgc/VK136ALFCcI/AAAAAAAADZc/Rgf2fRFhDnk/s1600/OpenConnection.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://3.bp.blogspot.com/-ycphFJlPvgc/VK136ALFCcI/AAAAAAAADZc/Rgf2fRFhDnk/s1600/OpenConnection.png\" height\u003d\"22\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003c/div\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eKILL 56\nKILL 57\n\u003c/pre\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://3.bp.blogspot.com/-DRQCtleTGzg/VKzICOnqMQI/AAAAAAAADWw/YoQp0QXLTSw/s1600/AutoClose.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" 
src\u003d\"http://3.bp.blogspot.com/-DRQCtleTGzg/VKzICOnqMQI/AAAAAAAADWw/YoQp0QXLTSw/s1600/AutoClose.png\" height\u003d\"54\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nBoth data and log file date modified are updated due to the auto close.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003eTaking database offline\u003c/b\u003e\u003cbr /\u003e\nBelow script take the database offline.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eALTER DATABASE Testing\nSET OFFLINE\nWITH ROLLBACK IMMEDIATE;\n\u003c/pre\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-Nt4vfq1sBTI/VKzIGVrZxXI/AAAAAAAADXA/EWipCcuKeko/s1600/TurnOfflineModifiedDate.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://1.bp.blogspot.com/-Nt4vfq1sBTI/VKzIGVrZxXI/AAAAAAAADXA/EWipCcuKeko/s1600/TurnOfflineModifiedDate.png\" height\u003d\"55\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nThe date modified for both data and log file are updated upon this offline action.\u003cbr /\u003e\n\u003cbr /\u003e\nWe do another test to turn the database back online and without any changes, take the database offline.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eALTER DATABASE Testing\nSET ONLINE;\n\nALTER DATABASE Testing\nSET OFFLINE\nWITH ROLLBACK IMMEDIATE;\n\u003c/pre\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://2.bp.blogspot.com/-1bVF9TQlK6I/VKzIG90PpdI/AAAAAAAADXQ/6-iKCXI6fHg/s1600/TurnOfflineWithoutNewData.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" 
src\u003d\"http://2.bp.blogspot.com/-1bVF9TQlK6I/VKzIG90PpdI/AAAAAAAADXQ/6-iKCXI6fHg/s1600/TurnOfflineWithoutNewData.png\" height\u003d\"51\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nThe date modified also updated upon the last database offline action.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003eStop SQL Server service\u003c/b\u003e\u003cbr /\u003e\nThe testing database is brought online. Stop the SQL Server service from SQL Server Configuration Manager, both database files date modified are updated.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://3.bp.blogspot.com/-J9teZ-6qffU/VKzIGCwOMHI/AAAAAAAADW4/oA7YTqgnoCk/s1600/StopSQLService.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://3.bp.blogspot.com/-J9teZ-6qffU/VKzIGCwOMHI/AAAAAAAADW4/oA7YTqgnoCk/s1600/StopSQLService.png\" height\u003d\"55\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cb\u003eObservation\u003c/b\u003e\u003cbr /\u003e\nThe database files date modified is updated due to these events,\u003cbr /\u003e\n\u003cbr /\u003e\n1) \u003cb\u003ePhysical file growth\u003c/b\u003e - Alter statement, file auto grow\u003cbr /\u003e\n2) \u003cb\u003ePhysical file shrink\u003c/b\u003e - Shrink command (eg. DBCC SHRINKFILE, DBCC SHRINKDABASE), auto shrink (not tested in this demo, but same concept applied)\u003cbr /\u003e\n3) \u003cb\u003eDatabase offline\u003c/b\u003e - Alter statement, SQL Server service\u003cbr /\u003e\n\u003cbr /\u003e\nFor the first two observations, the key word is that only physical file size change will update the date modified. Data records are stored in the data pages within the data file. 
The database engine continues to allocate new data to unallocated data pages in the existing data file until it runs out of unallocated pages, at which point it requests/performs a physical file growth, which then triggers the update of the Date Modified property.\u003cbr /\u003e\n\u003cbr /\u003e\nIn this demo, this is not obvious for the log file because the database behaves as if in the Simple recovery model: it automatically reuses the virtual log files (smaller logical log files within the log file) without growing, as long as the transactions are small enough to fit within the size of the log file. In the Full recovery model, the same Date Modified concept discussed here applies to the log file as well.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003eConclusion\u003c/b\u003e\u003cbr /\u003e\nThe Date Modified of the database files is not a good indicator of how recently the database has been used. The database could be actively in use while the Date Modified dates from a while back. A database file with plenty of unallocated space could continue to receive new data for a long time without triggering an update of its Date Modified property.\u003cbr /\u003e\n\u003cbr /\u003e\nHere are a few suggestions for determining whether the database has been used recently, without additional setup (eg. a trace or audit). 
Each of these methods has its pros and cons:\u003cbr /\u003e\n\u003cbr /\u003e\n- sp_who2 or sp_whoisactive to see if there are any current or sleeping connections to the database\u003cbr /\u003e\n- Query the\u0026nbsp;sys.dm_db_index_usage_stats DMV to examine the most recent seek, scan, lookup, and update of each database table / index.\u003cbr /\u003e\n- Check the database backup size from the backup history table (msdb.dbo.backupset) to see any size changes over time."},"link":[{"rel":"replies","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/1805727327102455505/comments/default","title":"Post Comments"},{"rel":"replies","type":"text/html","href":"https://www.travisgan.com/2015/01/sql-server-database-file-date-modified.html#comment-form","title":"1 Comments"},{"rel":"edit","type":"application/atom+xml","href":"https://draft.blogger.com/feeds/3846425018755283358/posts/default/1805727327102455505"},{"rel":"self","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/posts/default/1805727327102455505"},{"rel":"alternate","type":"text/html","href":"https://www.travisgan.com/2015/01/sql-server-database-file-date-modified.html","title":"SQL Server Database File - Date Modified"}],"author":[{"name":{"$t":"Travis"},"email":{"$t":"noreply@blogger.com"},"gd$image":{"rel":"http://schemas.google.com/g/2005#thumbnail","width":"35","height":"35","src":"//www.blogger.com/img/blogger_logo_round_35.png"}}],"media$thumbnail":{"xmlns$media":"http://search.yahoo.com/mrss/","url":"http://3.bp.blogspot.com/-Dwj8vpK4iEg/VK1so6LATcI/AAAAAAAADZA/5ybpghH-eU8/s72-c/datemodified.png","height":"72","width":"72"},"thr$total":{"$t":"1"}},{"id":{"$t":"tag:blogger.com,1999:blog-3846425018755283358.post-5090748117510616511"},"published":{"$t":"2015-01-04T23:30:00.000-06:00"},"updated":{"$t":"2015-01-04T23:31:30.507-06:00"},"category":[{"scheme":"http://www.blogger.com/atom/ns#","term":"Self Learning"}],"title":{"type":"text","$t":"Microsoft Specialist - Implementing Microsoft Azure 
Architecture Solutions"},"content":{"type":"html","$t":"I have been using Microsoft Azure with credit from my MSDN subscription (free credit! See the MSDN subscription section in this \u003ca href\u003d\"http://www.travisgan.com/2014/01/sql-server-licensing-test-environment.html\" target\u003d\"_blank\"\u003epost\u003c/a\u003e) for some time now, mostly for VMs and SQL Database (Microsoft Azure SQL Database). Recently I have also started exploring other Azure services, like Azure Active Directory for cloud identity and access management, as well as deploying .NET web applications to Azure Websites along with Visual Studio Online.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ca name\u003d'more'\u003e\u003c/a\u003eBack in October 2014, Born To Learn (the Microsoft Training and Certification Community) blogged about \u003ca href\u003d\"https://borntolearn.mslearn.net/b/weblog/archive/2014/10/16/get-certified-on-microsoft-azure-or-office-365-with-free-exams\" target\u003d\"_blank\"\u003efree vouchers\u003c/a\u003e to get certified on Microsoft Azure (the offer has ended). I was lucky enough to get a free voucher. In addition, there was a week-long Azure IaaS for IT Pros online event in the first week of December 2014, where multiple experts shared technical insights that could be helpful for this particular certification. 
If you missed the live event, the videos are also available \u003ca href\u003d\"http://channel9.msdn.com/Events/Microsoft-Azure/Level-Up-Azure-IaaS-for-IT-Pros\" target\u003d\"_blank\"\u003ehere\u003c/a\u003e.\u003cbr /\u003e\n\u003cbr /\u003e\nBelow are the technical tasks measured for the Implementing Microsoft Azure Infrastructure Solutions \u003ca href\u003d\"https://www.microsoft.com/learning/en-us/exam-70-533.aspx\" target\u003d\"_blank\"\u003ecertification\u003c/a\u003e,\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv\u003e\n\u003ca href\u003d\"http://2.bp.blogspot.com/-vhv_IaOSDCg/VKoNPhZffJI/AAAAAAAADUg/EykffFIZa3E/s1600/70-533.PNG\" imageanchor\u003d\"1\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://2.bp.blogspot.com/-vhv_IaOSDCg/VKoNPhZffJI/AAAAAAAADUg/EykffFIZa3E/s1600/70-533.PNG\" height\u003d\"255\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nWith all these great incentives, I thought this might be a great opportunity to widen and deepen my understanding of the Microsoft Azure platform, such as the details of how virtual networks are implemented and the different types of storage and their relative performance.\u003cbr /\u003e\n\u003cbr /\u003e\nAlthough I sometimes hear people downplay the benefit and validity of certification,\u0026nbsp;I encourage people to get certified.\u0026nbsp;Among the many benefits listed on this \u003ca href\u003d\"https://www.microsoft.com/learning/en-us/certification-testimonials.aspx\" target\u003d\"_blank\"\u003eMicrosoft learning page\u003c/a\u003e, one of the main ones for me personally is that while preparing for an exam, I am often exposed to parts and pieces of a technology I don't usually work with, and am sometimes not even aware of. This gives me an opportunity to verify my skills as well as to learn new things in depth while exploring them. 
I always emphasize that passing an exam is not the end, but rather part of the journey toward a better understanding of a technology.\u003cbr /\u003e\n\u003cbr /\u003e\nI passed the exam and am now certified as a Microsoft Specialist for Implementing Microsoft Azure Architecture Solutions. Microsoft Azure has been growing its collection of services, with new improvements and features, at a very fast pace. So, the learning continues, just like \u003ca href\u003d\"http://www.travisgan.com/2013/04/learning-continue.html\"\u003ethis\u003c/a\u003e.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://2.bp.blogspot.com/-NoGviGd9H54/VKof4hhEkZI/AAAAAAAADUw/pSH0YwO-_As/s1600/MS_2013(rgb)_2617.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://2.bp.blogspot.com/-NoGviGd9H54/VKof4hhEkZI/AAAAAAAADUw/pSH0YwO-_As/s1600/MS_2013(rgb)_2617.png\" /\u003e\u003c/a\u003e\u003c/div\u003e\n"},"link":[{"rel":"replies","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/5090748117510616511/comments/default","title":"Post Comments"},{"rel":"replies","type":"text/html","href":"https://www.travisgan.com/2015/01/microsoft-specialist-implementing.html#comment-form","title":"0 Comments"},{"rel":"edit","type":"application/atom+xml","href":"https://draft.blogger.com/feeds/3846425018755283358/posts/default/5090748117510616511"},{"rel":"self","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/posts/default/5090748117510616511"},{"rel":"alternate","type":"text/html","href":"https://www.travisgan.com/2015/01/microsoft-specialist-implementing.html","title":"Microsoft Specialist - Implementing Microsoft Azure Architecture 
Solutions"}],"author":[{"name":{"$t":"Travis"},"email":{"$t":"noreply@blogger.com"},"gd$image":{"rel":"http://schemas.google.com/g/2005#thumbnail","width":"35","height":"35","src":"//www.blogger.com/img/blogger_logo_round_35.png"}}],"media$thumbnail":{"xmlns$media":"http://search.yahoo.com/mrss/","url":"http://2.bp.blogspot.com/-vhv_IaOSDCg/VKoNPhZffJI/AAAAAAAADUg/EykffFIZa3E/s72-c/70-533.PNG","height":"72","width":"72"},"thr$total":{"$t":"0"}},{"id":{"$t":"tag:blogger.com,1999:blog-3846425018755283358.post-1519887682847615138"},"published":{"$t":"2014-12-01T10:03:00.001-06:00"},"updated":{"$t":"2014-12-01T10:03:49.166-06:00"},"category":[{"scheme":"http://www.blogger.com/atom/ns#","term":"SSIS"}],"title":{"type":"text","$t":"SSIS Connection Manager Properties"},"content":{"type":"html","$t":"SSIS packages stored in the SSIS Catalog (introduced in SQL Server 2012, and also in SQL Server 2014) allow configuration changes to be made on the project or package connection managers, depending on how the connection was set up. It is important to have a good understanding of how the values in these properties are used.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ca name\u003d'more'\u003e\u003c/a\u003eI couldn't find any documentation on how these SSIS connection manager properties affect the actual connection values. 
In this post, here are my observations on how changes to these properties affect the actual values used during execution.\u003cbr /\u003e\n\u003cbr /\u003e\nHere is the SSIS connection manager in the designer (in this case, Visual Studio with BI)\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-haQI7A4Tm-I/VHY3FGEgspI/AAAAAAAADQg/wa-PXbeWL54/s1600/DesignerSSIS.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://1.bp.blogspot.com/-haQI7A4Tm-I/VHY3FGEgspI/AAAAAAAADQg/wa-PXbeWL54/s1600/DesignerSSIS.png\" height\u003d\"365\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nThese are the connection manager property settings on the deployed package in the SQL Server SSIS Catalog.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://2.bp.blogspot.com/-1f1XhZf09mM/VHT6cwjvJNI/AAAAAAAADPU/T_4gQBx2eXE/s1600/connproperties.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://2.bp.blogspot.com/-1f1XhZf09mM/VHT6cwjvJNI/AAAAAAAADPU/T_4gQBx2eXE/s1600/connproperties.png\" height\u003d\"223\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nIt seems the ConnectionString value is generated based on the settings configured in the designer. Other property values (e.g. ServerName and InitialCatalog) appear to be similarly mapped, but not every property is, such as Integrated Security (SSPI).\u003cbr /\u003e\n\u003cbr /\u003e\nLooking at this example, the ConnectionString property, along with the other properties, appears to serve a similar purpose. Let's look at the details. Click the '...' 
on the right of the property to show the different options:\u003cbr /\u003e\n\u003cbr /\u003e\n1) Use the default value from the package (not editable)\u003cbr /\u003e\n2) Edit value. A new value can be entered here; this value is stored in SQL Server.\u003cbr /\u003e\n3) Use environment value. If an environment has been set up for the project, the property can be mapped to a value defined in the environment.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://3.bp.blogspot.com/-XIWGHlfJktU/VHT6JH1aZ_I/AAAAAAAADPI/620yNs8cT-4/s1600/ParameterValue.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://3.bp.blogspot.com/-XIWGHlfJktU/VHT6JH1aZ_I/AAAAAAAADPI/620yNs8cT-4/s1600/ParameterValue.png\" height\u003d\"375\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nLet's perform a few tests to see how changing these properties affects the connection values used during execution.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003e\u003cu\u003ePreparation\u003c/u\u003e\u003c/b\u003e\u003cbr /\u003e\n\u003cb\u003e\u003cu\u003e\u003cbr /\u003e\u003c/u\u003e\u003c/b\u003e\nI designed a simple SSIS package that obtains the server, database and login information, and stores them in a table for examination.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://3.bp.blogspot.com/-u-EViGLIfnA/VHY6YSee54I/AAAAAAAADQw/DMW1Byacb2w/s1600/SSISDesign.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://3.bp.blogspot.com/-u-EViGLIfnA/VHY6YSee54I/AAAAAAAADQw/DMW1Byacb2w/s1600/SSISDesign.png\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nFor the Get Session Info 
task, it queries the connection's server name, database name and login.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-fDrgaWf8XBQ/VHY6YTHXxVI/AAAAAAAADQs/H1MGw0gz9a0/s1600/GetInfo.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://1.bp.blogspot.com/-fDrgaWf8XBQ/VHY6YTHXxVI/AAAAAAAADQs/H1MGw0gz9a0/s1600/GetInfo.png\" height\u003d\"191\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nThe Store Session Info task simply stores the information in a table.\u003cbr /\u003e\n\u003cbr /\u003e\nOnce the package is ready, it is deployed to the SSIS Catalog for testing.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cu\u003eFirst Test\u003c/u\u003e\u003cbr /\u003e\nChange the ServerName value to TestServer; the ServerName value is now shown in\u0026nbsp;\u003cb\u003ebold\u003c/b\u003e. Notice that the Data Source value in the connection string is still Server1.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://4.bp.blogspot.com/-B8_adpuiYRc/VHT6-aoYMrI/AAAAAAAADPc/RQcSDnyyc90/s1600/servername.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://4.bp.blogspot.com/-B8_adpuiYRc/VHT6-aoYMrI/AAAAAAAADPc/RQcSDnyyc90/s1600/servername.png\" height\u003d\"220\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003cbr /\u003e\u003c/div\u003e\nExecution shows the server used is \u003cb\u003e\u003ci\u003eTestServer\u003c/i\u003e\u003c/b\u003e. 
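As a side note, the value actually stored for each property can be inspected directly in the catalog. Connection manager properties are exposed in the\u0026nbsp;SSISDB.catalog.object_parameters view with a CM. prefix; a sketch is below (the project name MyProject is hypothetical),\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003e--Sketch: inspect connection manager property values stored in the SSIS Catalog\nSELECT \n op.object_name,\n op.parameter_name, --e.g. CM.MyConn.ServerName\n op.design_default_value, --value from the designer\n op.default_value, --edited value stored on the server, if any\n op.value_type --V \u003d literal value, R \u003d environment reference\nFROM SSISDB.catalog.object_parameters op\nJOIN SSISDB.catalog.projects p\n ON op.project_id \u003d p.project_id\nWHERE p.name \u003d 'MyProject'\n AND op.parameter_name LIKE 'CM.%';\u003c/pre\u003e\n\u003cbr /\u003e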
A similar test shows that execution picks up the \u003cb\u003eedited\u003c/b\u003e InitialCatalog property (database)\u0026nbsp;\u003cb\u003evalue\u003c/b\u003e as well.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cu\u003eSecond Test\u003c/u\u003e\u003cbr /\u003e\nWith the ServerName property value edited to TestServer, we now change the Data Source value in the ConnectionString property to ProdServer.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://3.bp.blogspot.com/--L1EhZ1NvDM/VHT9xjSz_vI/AAAAAAAADPs/1PW_7lQGKQk/s1600/connservername.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://3.bp.blogspot.com/--L1EhZ1NvDM/VHT9xjSz_vI/AAAAAAAADPs/1PW_7lQGKQk/s1600/connservername.png\" height\u003d\"222\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003c/div\u003e\nExecution shows the server used is \u003cb\u003e\u003ci\u003eTestServer\u003c/i\u003e\u003c/b\u003e. 
In this case, the \u003cb\u003eedited\u003c/b\u003e ServerName property value appears to take precedence.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cu\u003eThird Test\u003c/u\u003e\u003cbr /\u003e\nThe original (unedited) connection string is this:\u003cbr /\u003e\nData Source\u003dServer1;Initial Catalog\u003dmaster;Provider\u003dSQLNCLI11.1;\u003cb\u003eIntegrated Security\u003dSSPI\u003c/b\u003e;Auto Translate\u003dFalse;\u003cbr /\u003e\n\u003cbr /\u003e\nNotice that the connection string has SSPI specified.\u003cbr /\u003e\n\u003cbr /\u003e\nChange the UserName property value to TestUser and add a password to the Password property.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://4.bp.blogspot.com/-hP0MsMkkWjc/VHT9xhFMZ4I/AAAAAAAADPo/JZ0_DxMcVbM/s1600/userpasswor.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://4.bp.blogspot.com/-hP0MsMkkWjc/VHT9xhFMZ4I/AAAAAAAADPo/JZ0_DxMcVbM/s1600/userpasswor.png\" height\u003d\"222\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nExecution shows that SSPI is used. 
In this case, SSPI appears to take precedence.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cu\u003eFourth Test\u003c/u\u003e\u003cbr /\u003e\nRemove SSPI, and add a user and password to the connection string:\u003cbr /\u003e\nData Source\u003dServer1;Initial Catalog\u003dmaster;Provider\u003dSQLNCLI11.1;\u003cb\u003eUser Name\u003dUser1; Password \u003d CrazyPassword;\u003c/b\u003e\u0026nbsp;Auto Translate\u003dFalse;\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://2.bp.blogspot.com/-7HXfC3OCvzk/VHT-_25jUlI/AAAAAAAADP4/DKFcfzeGTcY/s1600/connuserpassword.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://2.bp.blogspot.com/-7HXfC3OCvzk/VHT-_25jUlI/AAAAAAAADP4/DKFcfzeGTcY/s1600/connuserpassword.png\" height\u003d\"222\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nExecution shows that \u003cb\u003e\u003ci\u003eTestUser\u003c/i\u003e\u003c/b\u003e from the UserName property and its Password value (\u003cb\u003e\u003ci\u003eCrazyPassword\u003c/i\u003e\u003c/b\u003e) are used. These two properties take precedence when SSPI is not specified in the ConnectionString property.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003e\u003cu\u003eObservation\u003c/u\u003e\u003c/b\u003e\u003cbr /\u003e\nIt appears that the \u003cb\u003eedited\u003c/b\u003e ServerName, InitialCatalog, UserName and Password property values take precedence over the values in ConnectionString. 
However, SSPI takes precedence whenever it is specified in the connection string, regardless of any edited values in the UserName and Password properties.\u003cbr /\u003e\n\u003cbr /\u003e\nBased on this observation, if these property values need to differ from the designer (default) values, we could change the ConnectionString property value to reflect the change, or, often more clearly, change the respective properties directly. By directly, I mean changing a specific property (e.g. only the server value) without rewriting the entire connection string. Again, this applies only when SSPI is not specified.\u003cbr /\u003e\n\u003cbr /\u003e\nThis is also applicable when using environment variables, if we have previously set up an environment with its variables defined.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://4.bp.blogspot.com/-RehZkn93vDI/VHY-xh7m5qI/AAAAAAAADRA/j3_hf3vPxO4/s1600/EnvVariables.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://4.bp.blogspot.com/-RehZkn93vDI/VHY-xh7m5qI/AAAAAAAADRA/j3_hf3vPxO4/s1600/EnvVariables.png\" height\u003d\"125\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nWe could assign the ServerName property to use the environment value. 
Notice that the property value is now \u003cu\u003eunderlined\u003c/u\u003e.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-6ZwrWlFyz4M/VHY-xlH5thI/AAAAAAAADRE/vWaVPsVyfaw/s1600/ConnPropertiesEnv.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://1.bp.blogspot.com/-6ZwrWlFyz4M/VHY-xlH5thI/AAAAAAAADRE/vWaVPsVyfaw/s1600/ConnPropertiesEnv.png\" height\u003d\"236\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\nThis is often the case when developers design SSIS packages with connections pointing to development database servers (and sometimes different databases as well), and these properties need to be modified when the packages are deployed into a production environment and pointed to production servers and databases, with different user account credentials. 
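As another side note, instead of clicking through SSMS, these property values can also be changed with T-SQL using the catalog.set_object_parameter_value stored procedure; a sketch is below (the folder, project, package and connection manager names are hypothetical),\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003e--Sketch: set a connection manager property on a deployed package\n--Connection manager properties are addressed with the CM. prefix\nEXEC SSISDB.catalog.set_object_parameter_value\n @object_type \u003d 30, --30 \u003d package scope, 20 \u003d project scope\n @folder_name \u003d N'MyFolder',\n @project_name \u003d N'MyProject',\n @object_name \u003d N'Package.dtsx',\n @parameter_name \u003d N'CM.MyConn.ServerName',\n @parameter_value \u003d N'ProdServer',\n @value_type \u003d 'V'; --V \u003d literal value, R \u003d environment variable reference\u003c/pre\u003e\n\u003cbr /\u003e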
With this clarification, we know where and how to change the connection configuration correctly and efficiently."},"link":[{"rel":"replies","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/1519887682847615138/comments/default","title":"Post Comments"},{"rel":"replies","type":"text/html","href":"https://www.travisgan.com/2014/12/ssis-connection-manager-properties.html#comment-form","title":"0 Comments"},{"rel":"edit","type":"application/atom+xml","href":"https://draft.blogger.com/feeds/3846425018755283358/posts/default/1519887682847615138"},{"rel":"self","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/posts/default/1519887682847615138"},{"rel":"alternate","type":"text/html","href":"https://www.travisgan.com/2014/12/ssis-connection-manager-properties.html","title":"SSIS Connection Manager Properties"}],"author":[{"name":{"$t":"Travis"},"email":{"$t":"noreply@blogger.com"},"gd$image":{"rel":"http://schemas.google.com/g/2005#thumbnail","width":"35","height":"35","src":"//www.blogger.com/img/blogger_logo_round_35.png"}}],"media$thumbnail":{"xmlns$media":"http://search.yahoo.com/mrss/","url":"http://1.bp.blogspot.com/-haQI7A4Tm-I/VHY3FGEgspI/AAAAAAAADQg/wa-PXbeWL54/s72-c/DesignerSSIS.png","height":"72","width":"72"},"thr$total":{"$t":"0"}},{"id":{"$t":"tag:blogger.com,1999:blog-3846425018755283358.post-7461132448512924440"},"published":{"$t":"2014-06-12T13:16:00.001-05:00"},"updated":{"$t":"2014-06-13T09:14:53.645-05:00"},"category":[{"scheme":"http://www.blogger.com/atom/ns#","term":"Troubleshooting"}],"title":{"type":"text","$t":"Defunct File Cause SQL Server Upgrade Failure"},"content":{"type":"html","$t":"\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://2.bp.blogspot.com/-u69F4Hh-bwo/U5noSvO8FOI/AAAAAAAADI0/ju_GCQKdCLc/s1600/upgradebroken.jpg\" imageanchor\u003d\"1\" style\u003d\"clear: left; float: left; margin-bottom: 1em; margin-right: 
1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://2.bp.blogspot.com/-u69F4Hh-bwo/U5noSvO8FOI/AAAAAAAADI0/ju_GCQKdCLc/s1600/upgradebroken.jpg\" height\u003d\"100\" width\u003d\"125\" /\u003e\u003c/a\u003e\u003c/div\u003e\nIn the previous \u003ca href\u003d\"http://www.travisgan.com/2014/06/sql-server-defunct-filegroup-and.html\" target\u003d\"_blank\"\u003epost\u003c/a\u003e, we discussed how a data file or its filegroup becomes defunct. While making a data file defunct may be desirable in certain situations to resolve an immediate issue and allow the database to remain operational, you may encounter a surprise problem when you are ready to move up to the next version during a SQL Server upgrade.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ca name\u003d'more'\u003e\u003c/a\u003eAs shown in the previous post, in a situation where a data file is missing or cannot be brought online, no backup is available, and it has been determined that the data residing in the data file is no longer needed and does not need to be restored, one way to resolve this is to make the file/filegroup defunct, which allows certain database operations, such as backup, to complete successfully.\u003cbr /\u003e\n\u003cbr /\u003e\nWarning: the only way to bring an offline or defunct file back online is to restore from a backup that was taken prior to making the data file offline/defunct.\u003cbr /\u003e\n\u003cbr /\u003e\nAlthough making a data file/filegroup defunct resolves the immediate issue, it may cause problems during a SQL Server upgrade. Let's look at some examples. 
First, we create a database with a primary and a secondary filegroup on a SQL Server 2008 instance.\u003cbr /\u003e\n\u003cbr /\u003e\nNote: If you have performed the steps in the previous blog \u003ca href\u003d\"http://www.travisgan.com/2014/06/sql-server-defunct-filegroup-and.html\" target\u003d\"_blank\"\u003epost\u003c/a\u003e, you may need to delete the example database (testdb), its data files and backups before performing the steps below.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eUSE master;\nGO\n\n--Create database\nCREATE DATABASE testdb\nON PRIMARY\n( NAME \u003d testdb_data, \n FILENAME \u003d 'D:\\MSSQL10_50.MSSQL2008R2\\MSSQL\\Data\\testdb.mdf' ),\nFILEGROUP FG1\n( NAME \u003d testdb_file1, \n FILENAME \u003d 'D:\\MSSQL10_50.MSSQL2008R2\\MSSQL\\Data\\testdb_file1.ndf' )\nLOG ON \n( NAME \u003d testdb_log, \n FILENAME \u003d 'D:\\MSSQL10_50.MSSQL2008R2\\MSSQL\\Data\\testdb_log.ldf' );\nGO\n\u003c/pre\u003e\n\u003cbr /\u003e\nCreate a table on the primary filegroup and an index on the secondary filegroup.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eUSE testdb;\nGO\n\n--Create table on primary filegroup\nCREATE TABLE tbl1\n(\n col1 int IDENTITY (1,1) CONSTRAINT PK_tbl1_col1 PRIMARY KEY,\n col2 varchar(50)\n);\nGO\n\n--Create index on FG1 filegroup\nCREATE INDEX IX_tbl1_col2\nON tbl1\n(\n col2\n) ON FG1;\nGO\n\n--Insert data into table and index\nINSERT INTO tbl1\nVALUES ('ABC');\nGO\u003c/pre\u003e\n\u003cbr /\u003e\nAt the moment, the database has all its data files online. 
Let's back up the database for later use.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003e--Backup for later use\nBACKUP DATABASE testdb\nTO DISK \u003d 'testdb_good_backup.bak';\u003c/pre\u003e\n\u003cbr /\u003e\nMake the secondary data file defunct.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003e--Make the data file defunct\nALTER DATABASE testdb\nMODIFY FILE \n( NAME \u003d testdb_file1,\n OFFLINE );\nGO\n\nALTER DATABASE testdb\nREMOVE FILEGROUP FG1;\nGO\u003c/pre\u003e\n\u003cbr /\u003e\nThe query below shows that the data file is now in a defunct state.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eSELECT \n f.name file_group, \n d.name file_name, \n d.state_desc file_state\nFROM sys.filegroups f\nJOIN sys.database_files d\n ON f.data_space_id \u003d d.data_space_id;\nGO\u003c/pre\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://4.bp.blogspot.com/-6hKBDMjqdT8/U5kzXhJzpoI/AAAAAAAADIU/JcJltwPo74g/s1600/file_defunct.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://4.bp.blogspot.com/-6hKBDMjqdT8/U5kzXhJzpoI/AAAAAAAADIU/JcJltwPo74g/s1600/file_defunct.png\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nTake a backup of the database with the defunct file.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eBACKUP DATABASE testdb\nTO DISK \u003d 'testdb_defunct_backup.bak';\nGO\n\u003c/pre\u003e\n\u003cbr /\u003e\nOn a newer SQL Server instance, in this case SQL Server 2014, restore the database from the backup containing the defunct file.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eUSE master; \nGO\n\nRESTORE DATABASE testdb\nFROM DISK \u003d 'D:\\MSSQL10_50.MSSQL2008R2\\MSSQL\\Backup\\testdb_defunct_backup.bak'\nWITH \n MOVE 'testdb_data' TO 
'D:\\MSSQL12.MSSQL2014\\MSSQL\\Data\\testdb.mdf',\n MOVE 'testdb_file1' TO 'D:\\MSSQL12.MSSQL2014\\MSSQL\\Data\\testdb_file1.ndf',\n MOVE 'testdb_log' TO 'D:\\MSSQL12.MSSQL2014\\MSSQL\\Data\\testdb_log.ldf';\nGO \n\u003c/pre\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-meEOA4PScnA/U5kzoOgA-AI/AAAAAAAADIc/w4dD2MvenoU/s1600/defunct_restore_success.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://1.bp.blogspot.com/-meEOA4PScnA/U5kzoOgA-AI/AAAAAAAADIc/w4dD2MvenoU/s1600/defunct_restore_success.png\" height\u003d\"145\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nThe metadata of the index in the defunct file doesn't affect the restore process. The database restored and upgraded successfully. Of course, it will return an error if the index is queried on the new instance, just as on the original SQL Server instance.\u003cbr /\u003e\n\u003cbr /\u003e\nLet's take another example, this time with a fulltext index. First, restore the database back to its state before the secondary data file became defunct. 
We will restore from the good backup file.\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eUSE master;\nGO\n\nALTER DATABASE testdb\nSET SINGLE_USER\nWITH ROLLBACK IMMEDIATE;\nGO\n\n--Restore the database back to the state prior to defunct\nRESTORE DATABASE testdb\nFROM DISK \u003d 'testdb_good_backup.bak'\nWITH REPLACE;\nGO\n\u003c/pre\u003e\n\u003cbr /\u003e\nCreate a fulltext catalog and a fulltext index on filegroup FG1.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eUSE testdb;\nGO\n\n--Create fulltext catalog\nCREATE FULLTEXT CATALOG catalog1 AS DEFAULT;\nGO\n\n--Create fulltext index on filegroup FG1\nCREATE FULLTEXT INDEX ON tbl1 (col2)\nKEY INDEX PK_tbl1_col1\nON (FILEGROUP FG1);\nGO\n\u003c/pre\u003e\n\u003cbr /\u003e\nNow make the data file defunct, just as it was done previously.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003e--Make the data file defunct\nALTER DATABASE testdb\nMODIFY FILE \n( NAME \u003d testdb_file1,\n OFFLINE );\nGO\n\nALTER DATABASE testdb\nREMOVE FILEGROUP FG1;\nGO\n\u003c/pre\u003e\n\u003cbr /\u003e\nTake another backup. 
The backup contains the defunct data file with the fulltext index.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eBACKUP DATABASE testdb\nTO DISK \u003d 'testdb_defunct_fulltext_backup.bak';\nGO\n\u003c/pre\u003e\n\u003cbr /\u003e\nOn the newer SQL Server instance, drop the testdb database and restore from the backup just taken.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eUSE master; \nGO\n\nALTER DATABASE testdb\nSET SINGLE_USER\nWITH ROLLBACK IMMEDIATE;\nGO\n\nDROP DATABASE testdb;\nGO\n\nRESTORE DATABASE testdb\nFROM DISK \u003d 'D:\\MSSQL10_50.MSSQL2008R2\\MSSQL\\Backup\\testdb_defunct_fulltext_backup.bak'\nWITH \n MOVE 'testdb_data' TO 'D:\\MSSQL12.MSSQL2014\\MSSQL\\Data\\testdb.mdf',\n MOVE 'testdb_file1' TO 'D:\\MSSQL12.MSSQL2014\\MSSQL\\Data\\testdb_file1.ndf',\n MOVE 'testdb_log' TO 'D:\\MSSQL12.MSSQL2014\\MSSQL\\Data\\testdb_log.ldf';\nGO \n\u003c/pre\u003e\n\u003cbr /\u003e\nThis time, the restore operation fails with an error.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ccode\u003e\nProcessed 184 pages for database 'testdb', file 'testdb_data' on file 1.\u0026nbsp;\u003c/code\u003e\u003cbr /\u003e\n\u003ccode\u003eProcessed 0 pages for database 'testdb', file 'testdb_file1' on file 1.\u0026nbsp;\u003c/code\u003e\u003cbr /\u003e\n\u003ccode\u003eProcessed 7 pages for database 'testdb', file 'testdb_log' on file 1.\u0026nbsp;\u003c/code\u003e\u003cbr /\u003e\n\u003ccode\u003eConverting database 'testdb' from version 661 to the current version 782.\u0026nbsp;\u003c/code\u003e\u003cbr /\u003e\n\u003ccode\u003eDatabase 'testdb' running the upgrade step from version 661 to version 668.\u0026nbsp;\u003c/code\u003e\u003cbr /\u003e\n\u003ccode\u003eDatabase 'testdb' running the upgrade step from version 668 to version 669.\u0026nbsp;\u003c/code\u003e\u003cbr /\u003e\n\u003ccode\u003eDatabase 'testdb' running the upgrade step from version 669 to version 670.\u0026nbsp;\u003c/code\u003e\u003cbr 
/\u003e\n\u003ccode\u003eDatabase 'testdb' running the upgrade step from version 670 to version 671.\u0026nbsp;\u003c/code\u003e\u003cbr /\u003e\n\u003ccode\u003eDatabase 'testdb' running the upgrade step from version 671 to version 672.\u0026nbsp;\u003c/code\u003e\u003cbr /\u003e\n\u003ccode\u003e\u003cspan style\u003d\"color: red;\"\u003eMsg 667, Level 16, State 1, Line 24\nThe index \"\" for table \"\" (RowsetId 72057594039369728) resides on a filegroup (\"FG1\") \nthat cannot be accessed because it is offline, is being restored, or is defunct.\u0026nbsp;\u003c/span\u003e\u003c/code\u003e\u003cbr /\u003e\n\u003ccode\u003e\u003cspan style\u003d\"color: red;\"\u003eMsg 3013, Level 16, State 1, Line 24\u0026nbsp;\u003c/span\u003e\u003c/code\u003e\u003cbr /\u003e\n\u003ccode\u003e\u003cspan style\u003d\"color: red;\"\u003eRESTORE DATABASE is terminating abnormally.\n\u003c/span\u003e\u003c/code\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\nThere is also this error in the Windows application log and SQL Server log:\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ccode\u003eDuring upgrade, database raised exception 3602, severity 25, state 53, address 000007FED2F36365. Use the exception number to determine the cause.\u003c/code\u003e\u003cbr /\u003e\n\u003cbr /\u003e\nIf we pay closer attention, we can see that the restore step completed successfully. 
The failure occurs during the conversion (upgrade) step.\u003cbr /\u003e\n\u003cbr /\u003e\nRunning the query below on the original SQL Server instance shows that the object is one of the ifts (integrated full text search) internal tables.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eSELECT \n p.partition_id,\n o.name [object_name], \n o.type_desc,\n f.name [filegroup]\nFROM sys.allocation_units a\nJOIN sys.partitions p\n ON a.container_id \u003d p.hobt_id\nJOIN sys.filegroups f\n ON a.data_space_id \u003d f.data_space_id\nJOIN sys.objects o\n ON o.object_id \u003d p.object_id\nWHERE p.partition_id \u003d '72057594039369728'\n\u003c/pre\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://2.bp.blogspot.com/-5UPi_6bOAgw/U5nnZ1tJ_yI/AAAAAAAADIs/FEHD6lsU1rw/s1600/jfts.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://2.bp.blogspot.com/-5UPi_6bOAgw/U5nnZ1tJ_yI/AAAAAAAADIs/FEHD6lsU1rw/s1600/jfts.png\" height\u003d\"50\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cb\u003eObservation\u003c/b\u003e\u003cbr /\u003e\nWhen the data file becomes defunct, the metadata of the objects in the defunct data file is retained. The SQL Server upgrade process attempts to upgrade objects (e.g. the fulltext index) to the newer version, including the ones in the defunct data file (since their metadata is retained), and this leads to the upgrade failure. However, even though the fulltext index was determined to be no longer needed, it cannot be removed because the data file is in a defunct state, and so it continues to cause an error during the upgrade. 
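Before planning an upgrade, a quick check along these lines (a sketch) could flag any database carrying defunct or offline files early,\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003e--Sketch: list files in DEFUNCT (or OFFLINE) state across all databases\nSELECT \n DB_NAME(database_id) database_name,\n name logical_file_name,\n state_desc\nFROM sys.master_files\nWHERE state_desc IN ('DEFUNCT', 'OFFLINE');\u003c/pre\u003e\n\u003cbr /\u003e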
In this predicament, the only solution appears to be to extract everything (minus the objects in the defunct data file) from the existing database into a new database.\u003cbr /\u003e\n\u003cbr /\u003e\nThis issue is observed on upgrade from SQL Server 2008 to SQL Server 2012 and SQL Server 2014. Restore from SQL Server 2008 to SQL Server 2008 R2 doesn't have this problem as there is no upgrade step involved during the restore operation between these two versions."},"link":[{"rel":"replies","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/7461132448512924440/comments/default","title":"Post Comments"},{"rel":"replies","type":"text/html","href":"https://www.travisgan.com/2014/06/defunct-file-cause-sql-server-upgrade.html#comment-form","title":"0 Comments"},{"rel":"edit","type":"application/atom+xml","href":"https://draft.blogger.com/feeds/3846425018755283358/posts/default/7461132448512924440"},{"rel":"self","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/posts/default/7461132448512924440"},{"rel":"alternate","type":"text/html","href":"https://www.travisgan.com/2014/06/defunct-file-cause-sql-server-upgrade.html","title":"Defunct File Cause SQL Server Upgrade Failure"}],"author":[{"name":{"$t":"Travis"},"email":{"$t":"noreply@blogger.com"},"gd$image":{"rel":"http://schemas.google.com/g/2005#thumbnail","width":"35","height":"35","src":"//www.blogger.com/img/blogger_logo_round_35.png"}}],"media$thumbnail":{"xmlns$media":"http://search.yahoo.com/mrss/","url":"http://2.bp.blogspot.com/-u69F4Hh-bwo/U5noSvO8FOI/AAAAAAAADI0/ju_GCQKdCLc/s72-c/upgradebroken.jpg","height":"72","width":"72"},"thr$total":{"$t":"0"}},{"id":{"$t":"tag:blogger.com,1999:blog-3846425018755283358.post-6236161335080378656"},"published":{"$t":"2014-06-06T10:17:00.000-05:00"},"updated":{"$t":"2014-06-11T22:28:16.481-05:00"},"category":[{"scheme":"http://www.blogger.com/atom/ns#","term":"Backup/Recovery"}],"title":{"type":"text","$t":"SQL Server - Defunct Filegroup and Defunct 
Data File"},"content":{"type":"html","$t":"\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-F6BqZtdp2EU/U5HZ-dDEf3I/AAAAAAAADH0/hTZTUGz2eOY/s1600/defunct.gif\" imageanchor\u003d\"1\" style\u003d\"clear: left; float: left; margin-bottom: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://1.bp.blogspot.com/-F6BqZtdp2EU/U5HZ-dDEf3I/AAAAAAAADH0/hTZTUGz2eOY/s1600/defunct.gif\" height\u003d\"90\" width\u003d\"120\" /\u003e\u003c/a\u003e\u003c/div\u003e\nA data file becomes DEFUNCT when its respective filegroup is removed or when the data file or its filegroup is not included during the piecemeal restore in simple recovery model. Usually a filegroup can not be removed if the data file is not empty. However, a filegroup could be removed if one of its data file is not online. This post illustrates how a data file becomes DEFUNCT.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ca name\u003d'more'\u003e\u003c/a\u003eLet's create a database and show how a data file becomes DEFUNCT.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eUSE master;\nGO\n\n--Create database\nCREATE DATABASE testdb\nON PRIMARY\n( NAME \u003d testdb_data, \n FILENAME \u003d 'D:\\MSSQL10_50.MSSQL2008R2\\MSSQL\\Data\\testdb.mdf' ),\nFILEGROUP FG1\n( NAME \u003d testdb_file1, \n FILENAME \u003d 'D:\\MSSQL10_50.MSSQL2008R2\\MSSQL\\Data\\testdb_file1.ndf' ),\n( NAME \u003d testdb_file2, \n FILENAME \u003d 'D:\\MSSQL10_50.MSSQL2008R2\\MSSQL\\Data\\testdb_file2.ndf' )\nLOG ON \n( NAME \u003d testdb_log, \n FILENAME \u003d 'D:\\MSSQL10_50.MSSQL2008R2\\MSSQL\\Data\\testdb_log.ldf' );\nGO\n\n--Set database to simple recovery model\nALTER DATABASE testdb\nSET RECOVERY SIMPLE;\nGO\n\u003c/pre\u003e\n\u003cbr /\u003e\nThe query above creates a database with the PRIMARY filegroup and FG1 secondary filegroup. 
The PRIMARY filegroup has one data file (mdf) and the FG1 filegroup has two data files (ndf). The database is also set to the simple recovery model.\u003cbr /\u003e\n\u003cbr /\u003e\nNext, we create some data.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eUSE testdb;\nGO\n\n--Create table on primary filegroup\nCREATE TABLE tbl1\n(\n col1 int IDENTITY (1,1) CONSTRAINT PK_tbl1_col1 PRIMARY KEY,\n col2 char(800)\n);\nGO\n\n--Create index on FG1 filegroup\nCREATE INDEX IX_tbl1_col2\nON tbl1\n(\n col2\n) ON FG1;\nGO\n\n--Insert enough data so both secondary files in FG1 are filled with some data\nINSERT INTO tbl1\nVALUES ('A');\nGO 120\n\u003c/pre\u003e\n\u003cbr /\u003e\nThe query above creates the table on the PRIMARY filegroup, while its index is created on the FG1 filegroup.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eSELECT \n o.name tbl_name,\n i.name index_name,\n i.type_desc,\n f.name fg_name\nFROM sys.objects o\nJOIN sys.indexes i\n ON o.object_id \u003d i.object_id\nJOIN sys.filegroups f\n ON i.data_space_id \u003d f.data_space_id\nWHERE o.is_ms_shipped \u003d 0;\nGO\n\u003c/pre\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003c/div\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://4.bp.blogspot.com/-XG0WYM_oR5o/U5Sj2JgcjpI/AAAAAAAADII/cmzvtD4E3tQ/s1600/tbl_filegroup.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://4.bp.blogspot.com/-XG0WYM_oR5o/U5Sj2JgcjpI/AAAAAAAADII/cmzvtD4E3tQ/s1600/tbl_filegroup.png\" height\u003d\"77\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nThe query above verifies which filegroup the data is stored in. Now we try to make the data files in FG1 defunct. 
Let's first take a good backup for later use.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eBACKUP DATABASE testdb\nTO DISK \u003d 'testdb_good_backup.bak';\nGO\n\u003c/pre\u003e\n\u003cbr /\u003e\nNow, take the first secondary file (testdb_file1) offline.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eALTER DATABASE testdb\nMODIFY FILE \n( NAME \u003d testdb_file1,\n OFFLINE );\nGO\n\u003c/pre\u003e\n\u003cbr /\u003e\nWarning: be aware that the only way to bring the data file back online is to restore from a backup.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eSELECT \n f.name file_group, \n d.name file_name, \n d.state_desc file_state\nFROM sys.filegroups f\nJOIN sys.database_files d\n ON f.data_space_id \u003d d.data_space_id;\nGO\n\u003c/pre\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://2.bp.blogspot.com/-dX3CjInL4-o/U5DiuSPS-zI/AAAAAAAADHc/9aO5fDJC9ro/s1600/datafile_status.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://2.bp.blogspot.com/-dX3CjInL4-o/U5DiuSPS-zI/AAAAAAAADHc/9aO5fDJC9ro/s1600/datafile_status.png\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nThe query above shows that the testdb_file1 data file is now offline. That makes the FG1 filegroup offline as well. Any attempt to query data located on the offline filegroup returns an error. 
The query below forces the use of the index located in the offline filegroup.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eSELECT col2\nFROM dbo.tbl1 \nWITH (INDEX(IX_tbl1_col2));\nGO\n\u003c/pre\u003e\n\u003cbr /\u003e\n\u003ccode\u003e\u003cspan style\u003d\"color: red;\"\u003e\nMsg 315, Level 16, State 1, Line 56\nIndex \"IX_tbl1_col2\" on table \"dbo.tbl1\" (specified in the FROM clause) \nis disabled or resides in a filegroup which is not online.\u003c/span\u003e\u003c/code\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\nWhat about backing up the database at this moment?\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eBACKUP DATABASE testdb\nTO DISK \u003d 'testdb_backup.bak';\nGO\n\u003c/pre\u003e\n\u003cbr /\u003e\n\u003ccode\u003e\u003cspan style\u003d\"color: red;\"\u003eMsg 3007, Level 16, State 1, Line 115\u003cbr /\u003e\nThe backup of the file or filegroup \"testdb_file1\" is not permitted because it is not online. \nContainer state: \"Offline\" (7). Restore status: 0. BACKUP can be performed by using the FILEGROUP \nor FILE clauses to restrict the selection to include only online data.\u003cbr /\u003e\nMsg 3013, Level 16, State 1, Line 115\u0026nbsp;\u003c/span\u003e\u003c/code\u003e\u003cbr /\u003e\n\u003ccode\u003e\u003cspan style\u003d\"color: red;\"\u003eBACKUP DATABASE is terminating abnormally.\u003c/span\u003e\u003c/code\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\nThe only way to bring the data file back online is to restore the database from a backup, and you should always do that when a backup exists. However, if there is no backup and it has been determined that the data in the offline filegroup is not needed, one way to allow the backup is to make the filegroup defunct. 
That will remove the filegroup from the database but retain the metadata.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eALTER DATABASE testdb\nREMOVE FILEGROUP FG1;\nGO\n\u003c/pre\u003e\n\u003cbr /\u003e\nNow let's check the status of the data files again.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eSELECT \n f.name file_group, \n d.name file_name, \n d.state_desc file_state\nFROM sys.filegroups f\nJOIN sys.database_files d\n ON f.data_space_id \u003d d.data_space_id;\nGO\n\u003c/pre\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://4.bp.blogspot.com/-v_t0sdB7N7I/U5Di0qJeGJI/AAAAAAAADHk/B2pVemS5Sc8/s1600/defunct_file.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://4.bp.blogspot.com/-v_t0sdB7N7I/U5Di0qJeGJI/AAAAAAAADHk/B2pVemS5Sc8/s1600/defunct_file.png\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nEven though the testdb_file2 secondary file is online and not empty, the FG1 filegroup removal completed successfully. Notice that both secondary files in the FG1 filegroup are now in the DEFUNCT state. Querying data located in this filegroup returns the same error.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eBACKUP DATABASE testdb\nTO DISK \u003d 'testdb_defunct_backup.bak';\nGO\n\u003c/pre\u003e\n\u003cbr /\u003e\nThe backup can now be performed successfully, with the understanding that the data in the FG1 filegroup is no longer available.\n\u003cbr /\u003e\n\u003cbr /\u003e\nAs mentioned earlier, another way data files become DEFUNCT is during a piecemeal restore for a database in the simple recovery model. 
In this case, we restore the good testdb backup (testdb_good_backup.bak) taken earlier and restore only the primary filegroup to a new database (testdb_copy).\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eRESTORE DATABASE testdb_copy\nFILEGROUP \u003d 'PRIMARY'\nFROM DISK \u003d 'testdb_good_backup.bak'\nWITH \n MOVE 'testdb_data' TO 'D:\\MSSQL10_50.MSSQL2008R2\\MSSQL\\Data\\testdb_copy.mdf',\n MOVE 'testdb_log' TO 'D:\\MSSQL10_50.MSSQL2008R2\\MSSQL\\Data\\testdb_copy_log.ldf',\n PARTIAL;\nGO\n\u003c/pre\u003e\n\u003cbr /\u003e\n\u003ccode\u003e\nProcessed 184 pages for database 'testdb_copy', file 'testdb_data' on file 1.\nProcessed 2 pages for database 'testdb_copy', file 'testdb_log' on file 1.\u0026nbsp;\u003c/code\u003e\u003cbr /\u003e\n\u003ccode\u003e\u003cspan style\u003d\"color: red;\"\u003eMsg 3127, Level 16, State 1, Line 75\u0026nbsp;\u003c/span\u003e\u003c/code\u003e\u003cbr /\u003e\n\u003ccode\u003e\u003cspan style\u003d\"color: red;\"\u003eThe file 'testdb_file1' of restored database 'testdb_copy' is being left in the defunct state because the database is using the simple recovery model and the file is marked for read-write access. Therefore, only read-only files can be recovered by piecemeal restore.\u0026nbsp;\u003c/span\u003e\u003c/code\u003e\u003cbr /\u003e\n\u003ccode\u003e\u003cspan style\u003d\"color: red;\"\u003eMsg 3127, Level 16, State 1, Line 75\u0026nbsp;\u003c/span\u003e\u003c/code\u003e\u003cbr /\u003e\n\u003ccode\u003e\u003cspan style\u003d\"color: red;\"\u003eThe file 'testdb_file2' of restored database 'testdb_copy' is being left in the defunct state because the database is using the simple recovery model and the file is marked for read-write access. Therefore, only read-only files can be recovered by piecemeal restore.\u0026nbsp;\u003c/span\u003e\u003c/code\u003e\u003cbr /\u003e\n\u003ccode\u003eRESTORE DATABASE ... 
FILE\u003d\u003cname\u003e successfully processed 186 pages in 0.049 seconds (29.595 MB/sec).\n\u003c/name\u003e\u003c/code\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\nRunning the script below shows the two secondary files in the FG1 filegroup in the DEFUNCT state.\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eUSE testdb_copy;\nGO\n\nSELECT \n f.name file_group, \n d.name file_name, \n d.state_desc file_state\nFROM sys.filegroups f\nJOIN sys.database_files d\n ON f.data_space_id \u003d d.data_space_id;\nGO\n\u003c/pre\u003e\n\u003cbr /\u003e\nNote that the new database is a piecemeal restore from a good complete backup. A full restore from the backup (testdb_defunct_backup.bak) taken after the files became defunct creates a database with the secondary files in the DEFUNCT state, just as before.\u003cbr /\u003e\n\u003cbr /\u003e\nIn summary, a data file becomes DEFUNCT when its filegroup is removed while one of the filegroup's data files is not online. This action may be taken intentionally to allow tasks like backup to complete successfully when the offline file cannot be recovered, there is no backup, and, more importantly, the data in the filegroup has been determined to be no longer needed. Another situation is a piecemeal restore of a database in the simple recovery model, which leaves the non-recovered files or filegroups in the DEFUNCT state.\u003cbr /\u003e\n\u003cbr /\u003e\nDEFUNCT data files affect SQL Server upgrades in some cases. 
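As the Msg 3007 message earlier hinted, there is also a less drastic option than making the filegroup defunct: restrict the backup to the online data with the FILEGROUP clause. A minimal sketch using this example's names:

```sql
-- Back up only the online PRIMARY filegroup, skipping the offline FG1
BACKUP DATABASE testdb
FILEGROUP = 'PRIMARY'
TO DISK = 'testdb_primary_only.bak';
```

This protects the data you can still reach without removing FG1's metadata from the database.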
More detail in next post."},"link":[{"rel":"replies","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/6236161335080378656/comments/default","title":"Post Comments"},{"rel":"replies","type":"text/html","href":"https://www.travisgan.com/2014/06/sql-server-defunct-filegroup-and.html#comment-form","title":"2 Comments"},{"rel":"edit","type":"application/atom+xml","href":"https://draft.blogger.com/feeds/3846425018755283358/posts/default/6236161335080378656"},{"rel":"self","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/posts/default/6236161335080378656"},{"rel":"alternate","type":"text/html","href":"https://www.travisgan.com/2014/06/sql-server-defunct-filegroup-and.html","title":"SQL Server - Defunct Filegroup and Defunct Data File"}],"author":[{"name":{"$t":"Travis"},"email":{"$t":"noreply@blogger.com"},"gd$image":{"rel":"http://schemas.google.com/g/2005#thumbnail","width":"35","height":"35","src":"//www.blogger.com/img/blogger_logo_round_35.png"}}],"media$thumbnail":{"xmlns$media":"http://search.yahoo.com/mrss/","url":"http://1.bp.blogspot.com/-F6BqZtdp2EU/U5HZ-dDEf3I/AAAAAAAADH0/hTZTUGz2eOY/s72-c/defunct.gif","height":"72","width":"72"},"thr$total":{"$t":"2"}},{"id":{"$t":"tag:blogger.com,1999:blog-3846425018755283358.post-5209260195598265989"},"published":{"$t":"2014-04-15T06:00:00.000-05:00"},"updated":{"$t":"2014-04-15T10:42:15.459-05:00"},"category":[{"scheme":"http://www.blogger.com/atom/ns#","term":"sqlcmd"},{"scheme":"http://www.blogger.com/atom/ns#","term":"SQL Agent"}],"title":{"type":"text","$t":"SQL Server Agent Job and SQLCMD"},"content":{"type":"html","$t":"Have you ever encountered missing database backup or maintenance plan didn't complete as it was scheduled, especially error was encountered? 
This post discusses a behavior difference between a SQL Server Agent job step using the T-SQL type and SQLCMD with the CmdExec type.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ca name\u003d'more'\u003e\u003c/a\u003eSQL Server Agent jobs are widely used to execute scheduled SQL Server tasks like backups, maintenance, SQL stored procedures or scripts, as well as non-SQL tasks like executing programs.\u003cbr /\u003e\n\u003cbr /\u003e\nWhen creating a SQL Server job, its job steps define the functionality or tasks it performs. A SQL Server job step can be one of several \u003ca href\u003d\"http://technet.microsoft.com/en-us/library/ms189237.aspx#Security\" target\u003d\"_blank\"\u003etypes (subsystems)\u003c/a\u003e. Here are some of the commonly used types (subsystems):\u003cbr /\u003e\n\u003cbr /\u003e\nTransact-SQL script (T-SQL)\u003cbr /\u003e\nOperating System (CmdExec)\u003cbr /\u003e\nPowerShell\u003cbr /\u003e\nSQL Server Analysis Service (SSAS) command / query\u003cbr /\u003e\nSQL Server Integration Services (SSIS) package\u003cbr /\u003e\n\u003cbr /\u003e\nT-SQL is the most common job step type. It is used to execute T-SQL scripts or stored procedures. CmdExec is used to run programs, utilities, batch files, etc. The PowerShell type is used to execute a PowerShell script within the job step command. The SSAS type is used to execute SSAS XMLA scripts for SSAS backup or batch processing. The SSIS type is used to execute SSIS packages, including maintenance plans created through SSMS.\u003cbr /\u003e\n\u003cbr /\u003e\nLet's also briefly discuss the SQLCMD utility, which allows us to run T-SQL statements from the command prompt. This nice little utility lets an administrator execute commands against SQL Server without bringing up SSMS. Often this tool is used for quick admin checks or some automated tasks.\u003cbr /\u003e\n\u003cbr /\u003e\nNow back to the SQL Agent job. 
When executing T-SQL command or stored procedure (SP) through SQL Server Agent, most of time the T-SQL type in the job step is selected. This type is best for most of the functionality. However, there is one behavior executing T-SQL or SP this way when an error is encountered during the execution. Let's examine. Here is a simple example,\u003cbr /\u003e\n\u003cbr /\u003e\nCreate a stored procedure to perform backup for multiple databases.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eUSE [master];\nGO\n\nCREATE PROCEDURE dbo.backupsp\nAS\nBACKUP DATABASE [msdb] TO DISK \u003d 'msdb.bak';\nBACKUP DATABASE [nono] TO DISK \u003d 'nono.bak';\nBACKUP DATABASE [master] TO DISK \u003d 'master.bak';\nGO\n\u003c/pre\u003e\n\u003cbr /\u003e\nThere is no 'nono' database. The second backup statement should throw an error.\u003cbr /\u003e\n\u003cbr /\u003e\nNow we create a SQL Agent job with job step type T-SQL to execute this stored procedure.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eUSE [msdb]\nGO\n\nEXEC dbo.sp_add_job\n  @job_name\u003dN'Test Job (TSQL)',\n  @enabled\u003d1;\n\nEXEC dbo.sp_add_jobstep\n  @job_name\u003dN'Test Job (TSQL)',\n  @step_name\u003dN'Test Step (TSQL)',\n  @subsystem\u003dN'TSQL',\n  @command\u003dN'EXEC [master].dbo.backupsp;',\n  @flags\u003d4;\n\nEXEC dbo.sp_add_jobserver\n  @job_name\u003dN'Test Job (TSQL)', @server_name \u003d N'(LOCAL)';\nGO\u003c/pre\u003e\n\u003cbr /\u003e\nStart the job,\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eEXEC dbo.sp_start_job @job_name \u003d N'Test Job (TSQL)';\nGO\u003c/pre\u003e\n\u003cbr /\u003e\n\u003cb\u003eSQL Server 2014 and SQL Server 2012 job step detail\u003c/b\u003e\u003cbr /\u003e\n\u003ccode\u003eExecuted as user: NT SERVICE\\SQLSERVERAGENT. Processed 1728 pages for database 'msdb', file 'MSDBData' on file 3. [SQLSTATE 01000] (Message 4035) \u0026nbsp;Processed 7 pages for database 'msdb', file 'MSDBLog' on file 3. 
[SQLSTATE 01000] (Message 4035) \u0026nbsp;BACKUP DATABASE successfully processed 1735 pages in 3.124 seconds (4.337 MB/sec). [SQLSTATE 01000] (Message 3014) \u0026nbsp;\u003cspan style\u003d\"color: red;\"\u003eDatabase 'nono' does not exist. Make sure that the name is entered correctly. [SQLSTATE 08004] (Error 911) \u0026nbsp;BACKUP DATABASE is terminating abnormally. [SQLSTATE 42000] (Error 3013) \u003c/span\u003e\u0026nbsp;Processed 472 pages for database 'master', file 'master' on file 2. [SQLSTATE 01000] (Message 4035) \u0026nbsp;Processed 2 pages for database 'master', file 'mastlog' on file 2. [SQLSTATE 01000] (Message 4035) \u0026nbsp;BACKUP DATABASE successfully processed 474 pages in 0.534 seconds (6.933 MB/sec). [SQLSTATE 01000] (Message 3014). \u0026nbsp;The step failed.\u003c/code\u003e\n\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003eSQL Server 2008 R2 and earlier job step detail\u003c/b\u003e\u003cbr /\u003e\n\u003ccode\u003eExecuted as user: NT AUTHORITY\\NETWORK SERVICE. Processed 1856 pages for database 'msdb', file 'MSDBData' on file 2. [SQLSTATE 01000] (Message 4035) \u0026nbsp;Processed 2 pages for database 'msdb', file 'MSDBLog' on file 2. [SQLSTATE 01000] (Message 4035) \u0026nbsp;\u003cspan style\u003d\"color: red;\"\u003eBACKUP DATABASE successfully processed 1858 pages in 2.716 seconds (5.344 MB/sec). [SQLSTATE 01000] (Message 3014) \u0026nbsp;Database 'nono' does not exist. Make sure that the name is entered correctly. [SQLSTATE 08004] (Error 911) \u0026nbsp;BACKUP DATABASE is terminating abnormally. [SQLSTATE 42000] (Error 3013). \u003c/span\u003e\u0026nbsp;The step failed.\u003c/code\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\nYou may have already notice that in SQL Server 2008 R2 and earlier version, there is no backup for master database. The SQL agent job is terminated when it encounters an error. This is likely not the intended goal in this scenario. 
All other database backups should still be performed even if an error is encountered.\u003cbr /\u003e\n\u003cbr /\u003e\nTo mitigate this issue, we can utilize sqlcmd with a CmdExec job step within the SQL Agent job.\n\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eUSE [msdb]\nGO\n\nEXEC dbo.sp_add_job\n  @job_name\u003dN'Test Job (CmdExec)',\n  @enabled\u003d1;\n\nEXEC dbo.sp_add_jobstep\n  @job_name\u003dN'Test Job (CmdExec)',\n  @step_name\u003dN'Test Step (CmdExec)',\n  @subsystem\u003dN'CmdExec',\n  @command\u003dN'sqlcmd -E -S $(ESCAPE_SQUOTE(SRVR)) -Q \"EXEC [master].dbo.backupsp\" -b';\n\nEXEC dbo.sp_add_jobserver\n  @job_name\u003dN'Test Job (CmdExec)', @server_name \u003d N'(LOCAL)';\nGO\u003c/pre\u003e\n\u003cbr /\u003e\nThe -b option at the end of the command makes sqlcmd exit with a DOS ERRORLEVEL value when an error is encountered within sqlcmd. This gives us a better indication of the job failure. The SQL Server Agent token (SRVR) returns the computer and SQL Server instance name.\u003cbr /\u003e\n\u003cbr /\u003e\nStart the job,\u003cbr /\u003e\n\u003cpre class\u003d\"brush:sql\"\u003eEXEC dbo.sp_start_job @job_name \u003d N'Test Job (CmdExec)';\nGO\u003c/pre\u003e\n\u003cbr /\u003e\n\u003cb\u003eSQL Server 2014, SQL Server 2012, SQL Server 2008 R2 job step detail\u003c/b\u003e\u003cbr /\u003e\n\u003ccode\u003eExecuted as user: NT Service\\SQLSERVERAGENT. Processed 1728 pages for database 'msdb', file 'MSDBData' on file 4. \u0026nbsp;Processed 7 pages for database 'msdb', file 'MSDBLog' on file 4. \u0026nbsp;BACKUP DATABASE successfully processed 1735 pages in 2.613 seconds (5.186 MB/sec). \u0026nbsp;\u003cspan style\u003d\"color: red;\"\u003eMsg 911, Level 16, State 11, Server SQL2014, Procedure backupsp, Line 5 \u0026nbsp;Database 'nono' does not exist. Make sure that the name is entered correctly. 
\u0026nbsp;Msg 3013, Level 16, State 1, Server SQL2014, Procedure backupsp, Line 5 \u0026nbsp;BACKUP DATABASE is terminating abnormally. \u003c/span\u003e\u0026nbsp;Processed 472 pages for database 'master', file 'master' on file 3. \u0026nbsp;Processed 2 pages for database 'master', file 'mastlog' on file 3. \u0026nbsp;BACKUP DATABASE successfully processed 474 pages in 0.656 seconds (5.644 MB/sec). \u0026nbsp;Process Exit Code 1. \u0026nbsp;The step failed.\u003c/code\u003e\n\u003cbr /\u003e\n\u003cbr /\u003e\nExcept the executing user difference (WORKGROUP\\SQL2008R2$ for SQL Server 2008 R2), master database backup is performed for all SQL Server versions. Also, if you pay close attention, with sqlcmd, there is no\u0026nbsp;[SQLSTATE 08004] (Error 911) funky stuff at the end of each execution like T-SQL type does.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cb\u003eObservation\u003c/b\u003e\u003cbr /\u003e\nFor earlier SQL Server version, SQL Server agent job terminates immediately when it encounters error. This may produce unintended results as all the subsequent statements are not executed. This may become potential issue for backup, integrity check or index maintenance operations as it misses the rest of the executions.\n\u003cbr /\u003e\n\u003cbr /\u003e\nOne way to mitigate this behavior is to utilize the sqlcmd with CmdExec type within the SQL Server agent job. 
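For completeness: another mitigation that keeps the T-SQL job step type is to handle errors inside the procedure itself. Below is a sketch of a hypothetical variant of the example backupsp (name and layout are illustrative; note that not every error is catchable by TRY/CATCH):

```sql
-- Hypothetical variant of backupsp: each backup is wrapped in TRY/CATCH,
-- so a failure on one database does not stop the remaining backups.
CREATE PROCEDURE dbo.backupsp_safe
AS
BEGIN
    BEGIN TRY
        BACKUP DATABASE [msdb] TO DISK = 'msdb.bak';
    END TRY
    BEGIN CATCH
        PRINT ERROR_MESSAGE();  -- log the failure and continue
    END CATCH;

    BEGIN TRY
        BACKUP DATABASE [nono] TO DISK = 'nono.bak';  -- fails; caught below
    END TRY
    BEGIN CATCH
        PRINT ERROR_MESSAGE();
    END CATCH;

    BEGIN TRY
        BACKUP DATABASE [master] TO DISK = 'master.bak';
    END TRY
    BEGIN CATCH
        PRINT ERROR_MESSAGE();
    END CATCH;
END
GO
```

With this shape, the job step reports the errors but all three backup attempts are made regardless of the SQL Server version.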
sqlcmd completes the entire execution even if it encounters an error.\n\u003cbr /\u003e\n\u003cbr /\u003e\nFrom SQL Server 2012 onward, the SQL Server Agent job appears to complete the entire execution regardless of whether it encounters an error."},"link":[{"rel":"replies","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/5209260195598265989/comments/default","title":"Post Comments"},{"rel":"replies","type":"text/html","href":"https://www.travisgan.com/2014/04/sql-server-agent-job-and-sqlcmd.html#comment-form","title":"1 Comments"},{"rel":"edit","type":"application/atom+xml","href":"https://draft.blogger.com/feeds/3846425018755283358/posts/default/5209260195598265989"},{"rel":"self","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/posts/default/5209260195598265989"},{"rel":"alternate","type":"text/html","href":"https://www.travisgan.com/2014/04/sql-server-agent-job-and-sqlcmd.html","title":"SQL Server Agent Job and SQLCMD"}],"author":[{"name":{"$t":"Travis"},"email":{"$t":"noreply@blogger.com"},"gd$image":{"rel":"http://schemas.google.com/g/2005#thumbnail","width":"35","height":"35","src":"//www.blogger.com/img/blogger_logo_round_35.png"}}],"thr$total":{"$t":"1"}},{"id":{"$t":"tag:blogger.com,1999:blog-3846425018755283358.post-8734448967395161608"},"published":{"$t":"2014-04-01T22:47:00.004-05:00"},"updated":{"$t":"2014-04-01T22:49:20.279-05:00"},"category":[{"scheme":"http://www.blogger.com/atom/ns#","term":"Self Learning"}],"title":{"type":"text","$t":"SQL Server 2014 RTM Release!"},"content":{"type":"html","$t":"Microsoft releases SQL Server 2014 RTM to the public today! Yes, it's April Fools' Day, I know what you're thinking, but this is real. 
This post discusses different ways of getting the software.\u003cbr /\u003e\n\u003ca name\u003d'more'\u003e\u003c/a\u003e\u003cbr /\u003e\nSQL Server 2014 is packed with new features like In-Memory technologies (code name Hekaton), buffer pool extension, new cardinality estimation logic, tighter integration with Windows Azure, and many more!\u003cbr /\u003e\n\u003cbr /\u003e\nExcited? Let's get it! There are multiple ways you can download SQL Server 2014. You can download the evaluation edition with a 180-day trial at the \u003ca href\u003d\"http://technet.microsoft.com/en-US/evalcenter/dn205290.aspx\" target\u003d\"_blank\"\u003eTechNet Evaluation Center\u003c/a\u003e. The evaluation edition contains all features available in the Enterprise edition, so you can test all these new features.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://technet.microsoft.com/en-US/evalcenter/dn205290.aspx\" target\u003d\"_blank\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://1.bp.blogspot.com/-zIFi1nXnHYU/UzuHCBT7mwI/AAAAAAAAC_4/7nBO7NbvGik/s1600/evaluation.png\" height\u003d\"148\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nIf you are an MSDN subscriber, all SQL Server 2014 editions are available for download. 
Once you log in to the \u003ca href\u003d\"https://msdn.microsoft.com/en-us/subscriptions/downloads/\" target\u003d\"_blank\"\u003eMSDN download page\u003c/a\u003e, there is a new link for SQL Server 2014.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv style\u003d\"text-align: center;\"\u003e\n\u003ca href\u003d\"https://msdn.microsoft.com/en-us/subscriptions/downloads/\" target\u003d\"_blank\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://4.bp.blogspot.com/-PBB7ZWdZUaQ/UzszieMmB-I/AAAAAAAAC_Y/nMCy9ArcPyM/s1600/SQLServer2014DownloadOption.png\" /\u003e\u003c/a\u003e\n\u003c/div\u003e\n\u003cbr /\u003e\nHere is a partial list of the downloadable images available for SQL Server 2014.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv style\u003d\"text-align: center;\"\u003e\n\u003cimg border\u003d\"0\" src\u003d\"http://2.bp.blogspot.com/-c_hB6QTfi7A/UzszibdArKI/AAAAAAAAC_c/kf-8-OuJ6_0/s1600/SQLServer2014DownloadList.png\" height\u003d\"380\" width\u003d\"400\" /\u003e\n\u003c/div\u003e\n\u003cbr /\u003e\nAnother way to get your hands on SQL Server 2014 is through \u003ca href\u003d\"http://www.windowsazure.com/en-us/\" target\u003d\"_blank\"\u003eWindows Azure\u003c/a\u003e. It takes only a matter of minutes to provision a Windows Azure virtual machine with SQL Server 2014. The Windows Azure virtual machine gallery provides several images containing different versions and editions of SQL Server. The picture below shows the selected image for Windows Server 2012 R2 with SQL Server 2014 RTM. 
\u0026nbsp;If you are already an MSDN subscriber, definitely check this option out since there are free monthly Windows Azure credits as well as discounted rates.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv style\u003d\"text-align: center;\"\u003e\n\u003ca href\u003d\"http://www.windowsazure.com/en-us/\" target\u003d\"_blank\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://3.bp.blogspot.com/-s9W96nVJntY/UzsziaMzj3I/AAAAAAAAC_k/-FJTZGuhMbI/s1600/SQLServer2014Azure.png\" height\u003d\"256\" width\u003d\"400\" /\u003e\u003c/a\u003e\n\u003c/div\u003e\n\u003cbr /\u003e\nTraditionally there will also be a Developer edition for sale (more detail \u003ca href\u003d\"http://www.travisgan.com/2014/01/sql-server-licensing-test-environment.html\" target\u003d\"_blank\"\u003ehere\u003c/a\u003e). Since Microsoft just released SQL Server 2014 today, it may take some time for online or physical stores to start selling it."},"link":[{"rel":"replies","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/8734448967395161608/comments/default","title":"Post Comments"},{"rel":"replies","type":"text/html","href":"https://www.travisgan.com/2014/04/sql-server-2014-rtm-release.html#comment-form","title":"0 Comments"},{"rel":"edit","type":"application/atom+xml","href":"https://draft.blogger.com/feeds/3846425018755283358/posts/default/8734448967395161608"},{"rel":"self","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/posts/default/8734448967395161608"},{"rel":"alternate","type":"text/html","href":"https://www.travisgan.com/2014/04/sql-server-2014-rtm-release.html","title":"SQL Server 2014 RTM 
Release!"}],"author":[{"name":{"$t":"Travis"},"email":{"$t":"noreply@blogger.com"},"gd$image":{"rel":"http://schemas.google.com/g/2005#thumbnail","width":"35","height":"35","src":"//www.blogger.com/img/blogger_logo_round_35.png"}}],"media$thumbnail":{"xmlns$media":"http://search.yahoo.com/mrss/","url":"http://1.bp.blogspot.com/-zIFi1nXnHYU/UzuHCBT7mwI/AAAAAAAAC_4/7nBO7NbvGik/s72-c/evaluation.png","height":"72","width":"72"},"thr$total":{"$t":"0"}},{"id":{"$t":"tag:blogger.com,1999:blog-3846425018755283358.post-927285248526539903"},"published":{"$t":"2014-03-25T06:00:00.001-05:00"},"updated":{"$t":"2022-08-15T23:57:45.989-05:00"},"category":[{"scheme":"http://www.blogger.com/atom/ns#","term":"PowerShell"},{"scheme":"http://www.blogger.com/atom/ns#","term":"Troubleshooting"}],"title":{"type":"text","$t":"Use PowerShell To Test Port"},"content":{"type":"html","$t":"There are times when we need to identify or troubleshoot if firewall exception is configured correctly and the desired remote server port is open. Often time, I have seen IT Professional use Telnet or PuTTY to test. However, there is another way to do it with PowerShell.\u003cbr /\u003e\n\u003ca name\u003d'more'\u003e\u003c/a\u003e\u003cbr /\u003e\nBy default, Telnet is not installed on Windows or Windows Server. In order to use Telnet, the Telnet client has to be installed. In Windows, this can be done through Control Panel \u0026gt; Programs and Features \u0026gt; Turn Windows features on or off, and check the Telnet Client. 
For Windows Server, go to Server Manager \u0026gt; Features \u0026gt; Add Features \u0026gt; Telnet Client.\u003cbr /\u003e\n\u003cbr /\u003e\nOr add the Telnet Client feature through PowerShell,\n\n\u003cpre class\u003d\"brush:ps\"\u003e\nImport-Module ServerManager\nAdd-WindowsFeature -Name Telnet-Client\n\u003c/pre\u003e\n\nInstead of installing the Telnet client, we could alternatively use Windows Sockets through the System.Net.Sockets namespace provided in the .NET Framework.\n\u003cbr /\u003e\u003cbr /\u003e\nTo test a TCP port,\n\n\u003cpre class\u003d\"brush:ps\"\u003e\n$tcp \u003d New-Object System.Net.Sockets.TcpClient\n$tcp.Connect('\u0026lt;remote server\u0026gt;', \u0026lt;port\u0026gt;)\n\u003c/pre\u003e\n\nOr even one line of code if you wish,\n\n\u003cpre class\u003d\"brush:ps\"\u003e\n(New-Object System.Net.Sockets.TcpClient).Connect('\u0026lt;remote server\u0026gt;', \u0026lt;port\u0026gt;)\n\u003c/pre\u003e\n\nThat's it! No additional installation (provided that PowerShell is installed, of course). \n\u003cbr/\u003e\u003cbr/\u003e\nStarting in Windows Server 2012 R2, Microsoft includes Test-NetConnection in the NetTCPIP module. This command can be used to simulate the traditional ping and tracert, and to test TCP connectivity.\n\n\u003cpre class\u003d\"brush:ps\"\u003e\n# ping\nTest-NetConnection -ComputerName \u0026lt;remote server\u0026gt;\n\n# tcp test on a port\nTest-NetConnection -ComputerName \u0026lt;remote server\u0026gt; -Port \u0026lt;port\u0026gt;\n\n# tracert\nTest-NetConnection -ComputerName \u0026lt;remote server\u0026gt; -TraceRoute\n\u003c/pre\u003e\n\nA note on UDP port testing: UdpClient.Connect only sets the host/port for that UDP client without actually connecting to the remote host, and since UDP uses connectionless transmission, it is hard to reliably determine whether a remote UDP port is open or closed. 
One way to test is to send some packets to the remote UDP port; if an \"ICMP port unreachable\" message is received back, the UDP port is considered closed. Otherwise, the state is unknown.\n"},"link":[{"rel":"replies","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/927285248526539903/comments/default","title":"Post Comments"},{"rel":"replies","type":"text/html","href":"https://www.travisgan.com/2014/03/use-powershell-to-test-port.html#comment-form","title":"0 Comments"},{"rel":"edit","type":"application/atom+xml","href":"https://draft.blogger.com/feeds/3846425018755283358/posts/default/927285248526539903"},{"rel":"self","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/posts/default/927285248526539903"},{"rel":"alternate","type":"text/html","href":"https://www.travisgan.com/2014/03/use-powershell-to-test-port.html","title":"Use PowerShell To Test Port"}],"author":[{"name":{"$t":"Travis"},"email":{"$t":"noreply@blogger.com"},"gd$image":{"rel":"http://schemas.google.com/g/2005#thumbnail","width":"35","height":"35","src":"//www.blogger.com/img/blogger_logo_round_35.png"}}],"thr$total":{"$t":"0"}},{"id":{"$t":"tag:blogger.com,1999:blog-3846425018755283358.post-4923524882632074827"},"published":{"$t":"2014-03-24T06:00:00.000-05:00"},"updated":{"$t":"2014-03-24T06:00:08.657-05:00"},"category":[{"scheme":"http://www.blogger.com/atom/ns#","term":"PowerShell"}],"title":{"type":"text","$t":"PowerShell Cim Cmdlet to PowerShell 2.0 or older"},"content":{"type":"html","$t":"Starting in PowerShell 3.0, Microsoft introduced a new set of cmdlets to manage servers or devices that comply with \u003ca href\u003d\"http://www.dmtf.org/standards/cim\" target\u003d\"_blank\"\u003eDMTF\u003c/a\u003e standards. 
This new set of cmdlets, the Common Information Model (CIM) Cmdlets, allows IT professionals to better manage their data centers, especially when they consist of servers from different vendors.\u003cbr /\u003e\n\u003ca name\u003d'more'\u003e\u003c/a\u003e\u003cbr /\u003e\nBefore PowerShell 3.0, IT professionals commonly used WMI Cmdlets (WMI is the Windows implementation of the CIM standard) to obtain resource information on Windows, and WSMan Cmdlets for non-Windows systems. To allow and simplify management of heterogeneous environments, the standards-compliant and richer PowerShell CIM Cmdlets were introduced. You can find a list of CIM Cmdlets in this\u0026nbsp;\u003ca href\u003d\"http://technet.microsoft.com/en-us/library/jj553783.aspx\" target\u003d\"_blank\"\u003eTechNet article\u003c/a\u003e. This \u003ca href\u003d\"http://blogs.msdn.com/b/powershell/archive/2012/08/24/introduction-to-cim-cmdlets.aspx\" target\u003d\"_blank\"\u003eMSDN article\u003c/a\u003e provides a good introduction to the CIM cmdlets as well as a comparison with the older WMI cmdlets.\u003cbr /\u003e\n\u003cbr /\u003e\nWindows Server 2012 and Windows 8 shipped with PowerShell 3.0, which supports these new CIM Cmdlets. Servers or devices with older versions of PowerShell do not support CIM Cmdlets directly.\u003cbr /\u003e\n\u003cbr /\u003e\nFor example, if we want to get some processor information of a remote server (e.g. 
Server1), traditionally we use WMI cmdlets like this,\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ccode\u003e\nGet-WmiObject -Class Win32_Processor -ComputerName Server1 | Select-Object -Property SocketDesignation, NumberOfCores, NumberOfLogicalProcessors, CurrentClockSpeed\u003c/code\u003e\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-ZoGuVxrhV_E/Uy7jAwfJO5I/AAAAAAAAC-U/ly1RbjWGDG0/s1600/wmi.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://1.bp.blogspot.com/-ZoGuVxrhV_E/Uy7jAwfJO5I/AAAAAAAAC-U/ly1RbjWGDG0/s1600/wmi.png\" height\u003d\"50\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nWin32_Processor represents an instance for each processor.\u003cbr /\u003e\nGet-WmiObject gets instances of a WMI class, in this case Win32_Processor.\u003cbr /\u003e\n\u003cbr /\u003e\nTo use the new CIM cmdlets, for the remote server (e.g. 
Server1) that has at least PowerShell 3.0, we could execute the command like this (try using tab completion for ClassName, like *processor, for a richer experience),\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ccode\u003e\nGet-CimInstance -ClassName CIM_Processor -ComputerName Server1 | Select-Object\u0026nbsp;SocketDesignation, NumberOfCores, NumberOfLogicalProcessors, CurrentClockSpeed\u003c/code\u003e\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-EhqJ6B6PDUM/Uy5lCLh82aI/AAAAAAAAC90/zzEPMfE-8IA/s1600/ciminstance-3.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://1.bp.blogspot.com/-EhqJ6B6PDUM/Uy5lCLh82aI/AAAAAAAAC90/zzEPMfE-8IA/s1600/ciminstance-3.png\" height\u003d\"55\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nCIM_Processor represents an instance for each processor. 
This CIM_Processor class is the parent class of Win32_Processor.\u003cbr /\u003e\nGet-CimInstance gets the CIM instances of a class, in this case CIM_Processor.\u003cbr /\u003e\n\u003cbr /\u003e\nIf the same command is issued to obtain processor information from another server (e.g. server2) that currently has PowerShell 2.0 installed,\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-MyIl6CWtZtw/Uy5lCLX5lXI/AAAAAAAAC94/YGizUR4cZ1Q/s1600/ciminstance-2-fail.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://1.bp.blogspot.com/-MyIl6CWtZtw/Uy5lCLX5lXI/AAAAAAAAC94/YGizUR4cZ1Q/s1600/ciminstance-2-fail.png\" height\u003d\"75\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nThe command failed.\u003cbr /\u003e\n\u003cbr /\u003e\nThe reason is that the Get-CimInstance cmdlet uses\u0026nbsp;WinRM (the Windows implementation of the WSMan protocol),\u0026nbsp;which is only supported from PowerShell\u0026nbsp;3.0 onward. In order to execute the Get-CimInstance command against Server2 with PowerShell 2.0 installed, we have to force the CIM cmdlet to use the DCOM protocol (used by WMI), which is supported in all PowerShell versions (as of this post).\u003cbr /\u003e\n\u003cbr /\u003e\nTo do that, we use a CIM session, which provides an option of using DCOM or WSMan. 
In this case, select the DCOM protocol and execute Get-CimInstance against the CIM session.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ccode\u003e\n$so \u003d New-CimSessionOption -Protocol Dcom\u003cbr /\u003e\n$s \u003d New-CimSession -ComputerName Server2 -SessionOption $so\u003cbr /\u003e\nGet-CimInstance -ClassName CIM_Processor -CimSession $s |\u003cbr /\u003e\nSelect-Object -Property SocketDesignation, NumberOfCores, NumberOfLogicalProcessors, CurrentClockSpeed\u003c/code\u003e\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://2.bp.blogspot.com/-Xb3LLPuVtPs/Uy5lCOlNaeI/AAAAAAAAC-A/mKFEZrTJaRA/s1600/ciminstance-2-success.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://2.bp.blogspot.com/-Xb3LLPuVtPs/Uy5lCOlNaeI/AAAAAAAAC-A/mKFEZrTJaRA/s1600/ciminstance-2-success.png\" height\u003d\"87\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nIt is important to know which protocol is being used, as each uses different ports, which affects the firewall exception setup.\u003cbr /\u003e\n\u003cbr /\u003e\nIf the firewall is not configured correctly, you may receive this error.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ccode\u003e\u003cspan style\u003d\"color: red;\"\u003eNew-CimSession : The RPC server is unavailable.\u003c/span\u003e\u003c/code\u003e\u003cbr /\u003e\n\u003cbr /\u003e\nDCOM assigns TCP ports dynamically from 1024 through 65535. 
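Before adjusting any firewall rules, a quick reachability check against the RPC endpoint mapper on TCP port 135 can confirm basic DCOM connectivity. This is only a sketch: Server2 is a placeholder name, and Test-NetConnection requires Windows Server 2012 R2 / Windows 8.1 or later on the machine running the test,\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:ps\"\u003e\n# DCOM first contacts the RPC endpoint mapper on TCP 135,\n# then moves to a dynamically assigned high port\nTest-NetConnection -ComputerName Server2 -Port 135\n\u003c/pre\u003e\n\u003cbr /\u003e\n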
In order for WMI cmdlets to connect to the remote server successfully, enable the Windows Firewall: Allow remote administration exception on the remote server (gpedit.msc \u0026gt; local computer policy \u0026gt; computer configuration \u0026gt; administrative templates \u0026gt; network \u0026gt; network connections \u0026gt; windows firewall \u0026gt; domain/standard profile), which will enable the RPC firewall rules shown below.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://2.bp.blogspot.com/-zCgOwmU6o8s/Uy5VZTyNahI/AAAAAAAAC9k/INN503GpqdY/s1600/RPC-firewall.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://2.bp.blogspot.com/-zCgOwmU6o8s/Uy5VZTyNahI/AAAAAAAAC9k/INN503GpqdY/s1600/RPC-firewall.png\" height\u003d\"52\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nFor asynchronous callbacks from the remote computer, the local computer needs to enable the Windows Management Instrumentation (Async-In) firewall rule and sometimes the (DCOM-In) firewall rule. One example is the remote SQL Server Configuration Manager. 
More information is available in this \u003ca href\u003d\"http://www.travisgan.com/2013/12/windows-firewall-to-remote-sql-server.html\" target\u003d\"_blank\"\u003epost\u003c/a\u003e.\u003cbr /\u003e\n\u003cbr /\u003e\nThe easier way I found for the WMI firewall setup is to go to Control Panel \u0026gt; System and Security \u0026gt; Windows Firewall \u0026gt;\u0026nbsp;Allow a program or feature through Windows Firewall, and select Windows Management Instrumentation (WMI).\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-kc3MIlL8AHw/Uy82z8p1UeI/AAAAAAAAC-o/YdGnUNwJBGE/s1600/wmi-allow.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://1.bp.blogspot.com/-kc3MIlL8AHw/Uy82z8p1UeI/AAAAAAAAC-o/YdGnUNwJBGE/s1600/wmi-allow.png\" height\u003d\"144\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nThis will enable the firewall rules as shown below.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://3.bp.blogspot.com/-IocPadu53T8/Uy82ziw4PqI/AAAAAAAAC-s/-F3xw4evmos/s1600/wmi-allow-firewall.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://3.bp.blogspot.com/-IocPadu53T8/Uy82ziw4PqI/AAAAAAAAC-s/-F3xw4evmos/s1600/wmi-allow-firewall.png\" height\u003d\"40\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nAs long as there are no RPC or other similar firewall rules blocking the ports, these WMI firewall rules take care of the required WMI firewall exceptions.\u003cbr /\u003e\n\u003cbr /\u003e\nThe WinRM protocol is a SOAP (Simple Object Access Protocol) based protocol that uses TCP port 5985 for HTTP transport and 5986 for HTTPS.\u003cbr /\u003e\n\u003cbr /\u003e\nTo enable the 
firewall, you can run \u003ccode\u003ewinrm quickconfig\u003c/code\u003e in PowerShell on the remote server, which will enable the firewall rules shown below.\u003cbr /\u003e\n\u003cbr /\u003e\nSimilarly, we could configure Windows Remote Management through Windows Firewall \u0026gt; Allow a program or feature through Windows Firewall.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-YaZKTfdefWc/Uy85NlYDtvI/AAAAAAAAC-4/BjSclSdxk1s/s1600/winrm-allow.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://1.bp.blogspot.com/-YaZKTfdefWc/Uy85NlYDtvI/AAAAAAAAC-4/BjSclSdxk1s/s1600/winrm-allow.png\" height\u003d\"170\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nThese are the firewall rules enabled.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://3.bp.blogspot.com/-ZH96PlGODeo/Uy85eqmmUzI/AAAAAAAAC_I/7MMdGSsBHiU/s1600/winrm-allow-firewall.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" src\u003d\"http://3.bp.blogspot.com/-ZH96PlGODeo/Uy85eqmmUzI/AAAAAAAAC_I/7MMdGSsBHiU/s1600/winrm-allow-firewall.png\" height\u003d\"30\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003c/div\u003e\nThe compatibility mode is for the older WinRM (WinRM 1.1), which uses the common port 80. Since WinRM 2.0, the port has changed to 5985. 
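To quickly verify from another machine that the WinRM listener and its firewall rule are working, you could run the following (a sketch; server1 is a placeholder name),\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cpre class\u003d\"brush:ps\"\u003e\n# Returns the remote WSMan identity information if TCP 5985\n# is reachable and the WinRM service is listening\nTest-WSMan -ComputerName server1\n\u003c/pre\u003e\n\u003cbr /\u003e\n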
Also, if required, the HTTPS firewall rule could be configured.\u003cbr /\u003e\n\u003cbr /\u003e\nIf the firewall for WinRM is not configured correctly, you may receive this error when using CIM cmdlets with the default WinRM protocol (or when the session option specifies WSMan).\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ccode\u003e\u003cspan style\u003d\"color: red;\"\u003e\nGet-CimInstance : WinRM cannot complete the operation. Verify that the specified computer name is valid, that the computer is accessible over the network, and that a firewall exception for the WinRM service is enabled and allows access from this computer. By default, the WinRM firewall exception for public profiles limits access to remote computers within the same local subnet.\u003c/span\u003e\u003c/code\u003e\u003cbr /\u003e\n\u003cbr /\u003e\nI hope this post helps you use CIM cmdlets against older PowerShell versions and with the firewall setup."},"link":[{"rel":"replies","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/4923524882632074827/comments/default","title":"Post Comments"},{"rel":"replies","type":"text/html","href":"https://www.travisgan.com/2014/03/powershell-cim-cmdlet-to-powershell-20.html#comment-form","title":"0 Comments"},{"rel":"edit","type":"application/atom+xml","href":"https://draft.blogger.com/feeds/3846425018755283358/posts/default/4923524882632074827"},{"rel":"self","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/posts/default/4923524882632074827"},{"rel":"alternate","type":"text/html","href":"https://www.travisgan.com/2014/03/powershell-cim-cmdlet-to-powershell-20.html","title":"PowerShell Cim Cmdlet to PowerShell 2.0 or 
older"}],"author":[{"name":{"$t":"Travis"},"email":{"$t":"noreply@blogger.com"},"gd$image":{"rel":"http://schemas.google.com/g/2005#thumbnail","width":"35","height":"35","src":"//www.blogger.com/img/blogger_logo_round_35.png"}}],"media$thumbnail":{"xmlns$media":"http://search.yahoo.com/mrss/","url":"http://1.bp.blogspot.com/-ZoGuVxrhV_E/Uy7jAwfJO5I/AAAAAAAAC-U/ly1RbjWGDG0/s72-c/wmi.png","height":"72","width":"72"},"thr$total":{"$t":"0"}},{"id":{"$t":"tag:blogger.com,1999:blog-3846425018755283358.post-2441974662477405342"},"published":{"$t":"2014-03-21T06:00:00.000-05:00"},"updated":{"$t":"2018-03-11T22:57:13.753-05:00"},"category":[{"scheme":"http://www.blogger.com/atom/ns#","term":"PowerShell"}],"title":{"type":"text","$t":"Enable PowerShell Double-Hop Remoting"},"content":{"type":"html","$t":"When working with PowerShell, there are cases when there is a need to remote into a computer with a PowerShell session and perform tasks against another computer. This is the typical double-hop scenario common in PowerShell.\u003cbr /\u003e\n\u003ca name\u003d'more'\u003e\u003c/a\u003e\u003cbr /\u003e\nTo illustrate with a simple example, there is a user, TestUser, with his computer, client1, and there are two servers, server1 and server2. TestUser remotes to server1 using Enter-PSSession (first hop). 
Within that remote session, the user executes a WMI command to obtain the operating system version of server2 (second hop).\u003cbr /\u003e\n\u003cbr /\u003e\nNote: TestUser could actually get the WMI information of server2 directly from client1; this is just a simple example.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://3.bp.blogspot.com/-qr3L1MyNwW8/UyuonMylrYI/AAAAAAAAC8M/dp5RDK1brd4/s1600/doublehop.PNG\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"66\" src\u003d\"https://3.bp.blogspot.com/-qr3L1MyNwW8/UyuonMylrYI/AAAAAAAAC8M/dp5RDK1brd4/s1600/doublehop.PNG\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nWhen the user remotes to server1 from client1, how the user is authenticated and how the credential is passed to server1 depends on the type of authentication. The first-hop authentication is usually straightforward and can be done using NTLM, Kerberos, or other authentication methods. The tricky part is the authentication on the second hop. In order for the user credential to be passed for second-hop authentication, Kerberos authentication is usually used at both the first and second hop. Kerberos authentication requires all parties to be in the same Active Directory domain, SPN registration, as well as the user account being enabled for delegation. 
However, when Kerberos delegation cannot be used, PowerShell WinRM supports the Credential Security Support Provider (CredSSP) for authentication.\u003cbr /\u003e\n\u003cbr /\u003e\nHere are the PowerShell commands for the above-mentioned scenario without CredSSP,\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ccode\u003e\nPS C:\\\u0026gt; $cred \u003d Get-Credential\u003cbr /\u003e\nPS C:\\\u0026gt; Enter-PSSession -ComputerName server1 -Credential $cred\u003cbr /\u003e\n[server1]: PS C:\\Users\\testuser\\Documents\u0026gt; Get-WmiObject -Class Win32_OperatingSystem -ComputerName server2 | Select-Object -ExpandProperty caption\u003c/code\u003e\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ccode\u003e\u003cspan style\u003d\"color: red;\"\u003eAccess is denied. (Exception from HRESULT: 0x80070005 (E_ACCESSDENIED))\u003c/span\u003e\u003c/code\u003e\u003cbr /\u003e\n\u003cbr /\u003e\nChecking the security log of server1, it uses Kerberos authentication.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://4.bp.blogspot.com/-bveL5SYszt8/Uyu_7b9UfNI/AAAAAAAAC8c/bDfOVlE2OzM/s1600/firsthop-auth.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"259\" src\u003d\"https://4.bp.blogspot.com/-bveL5SYszt8/Uyu_7b9UfNI/AAAAAAAAC8c/bDfOVlE2OzM/s1600/firsthop-auth.png\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nSince the account has not been enabled for delegation in Active Directory, Kerberos delegation can't be used for the second hop. 
The security log on server2 shows NT AUTHORITY\\ANONYMOUS LOGON as the login credential, hence the access denied error.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://2.bp.blogspot.com/-4hjU5JEg3ew/Uyu_8hdVovI/AAAAAAAAC8k/FIOS2vAAOsE/s1600/secondhop-auth.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"242\" src\u003d\"https://2.bp.blogspot.com/-4hjU5JEg3ew/Uyu_8hdVovI/AAAAAAAAC8k/FIOS2vAAOsE/s1600/secondhop-auth.png\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nTo enable Credential Security Support Provider (CredSSP) authentication, run this command on the client computer to enable the client role and specify a computer to which the credential can be delegated,\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ccode\u003eEnable-WSManCredSSP -Role Client -DelegateComputer server1.testdomain.com\u003c/code\u003e\u003cbr /\u003e\n\u003cbr /\u003e\nIf the WinRM service is not running, you may receive this error.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ccode\u003e\u003cspan style\u003d\"color: red;\"\u003e\nEnable-WSManCredSSP : \u0026lt;f:WSManFault xmlns:f\u003d\"http://schemas.microsoft.com/wbem/wsman/1/wsmanfault\" Code\u003d\"2150858770\"\u0026nbsp;Machine\u003d\"client1.testdomain.com\"\u0026gt;\u0026lt;f:Message\u0026gt;The client cannot connect to the destination specified in the request. Verify\u0026nbsp;that the service on the destination is running and is accepting requests. Consult the logs and documentation for the\u0026nbsp;WS-Management service running on the destination, most commonly IIS or WinRM. If the destination is the WinRM\u0026nbsp;service, run the following command on the destination to analyze and configure the WinRM service: \"winrm\u0026nbsp;quickconfig\". 
\u0026lt;/f:Message\u0026gt;\u0026lt;/f:WSManFault\u0026gt;\u003c/span\u003e\u003c/code\u003e\u003cbr /\u003e\n\u003cbr /\u003e\nRun \u003ccode\u003ewinrm quickconfig\u003c/code\u003e to verify if WinRM is set up correctly. Turn the WinRM service on when prompted (the service startup is also set to delayed auto start). Note: you don't need to allow remote access to the client machine if it is not required.\u003cbr /\u003e\n\u003cbr /\u003e\nNow try to enable the CredSSP again.\u003cbr /\u003e\n\u003cbr /\u003e\nOn the client computer,\u003cbr /\u003e\n\u003ccode\u003e\nEnable-WSManCredSSP -Role Client -DelegateComputer server1.testdomain.com\u003c/code\u003e\u003cbr /\u003e\n\u003cbr /\u003e\nNote: you can use *.testdomain.com to allow the credential to be delegated from this client to all computers within the specified domain.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://4.bp.blogspot.com/-w7JKsSDopGI/UypumQBVxCI/AAAAAAAAC70/PF6bfLTSqOg/s1600/enable-wsmancredssp_client.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"105\" src\u003d\"https://4.bp.blogspot.com/-w7JKsSDopGI/UypumQBVxCI/AAAAAAAAC70/PF6bfLTSqOg/s1600/enable-wsmancredssp_client.png\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003c/div\u003e\nTo verify,\u003cbr /\u003e\n\u003ccode\u003eGet-WSManCredSSP\u003c/code\u003e\u003cbr /\u003e\n\u003cbr /\u003e\nThe machine is configured to allow delegating fresh credentials to the following target(s): wsman/server1.testdomain.com\u003cbr /\u003e\nThis computer is not configured to receive credentials from a remote client computer.\u003cbr /\u003e\n\u003cbr /\u003e\nAlternatively, you can verify through gpedit.msc.\u003cbr /\u003e\nRun 
gpedit.msc, expand Computer Configuration, expand Administrative Templates, expand System, and select Credentials Delegation.\u003cbr /\u003e\n\u003cbr /\u003e\nThe 'Allow Delegating Fresh Credentials' setting should be enabled.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://4.bp.blogspot.com/-V5h4Dyds1ZE/UyptFCvdJ9I/AAAAAAAAC7Y/FKaBEzLHQKg/s1600/gpedit.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"176\" src\u003d\"https://4.bp.blogspot.com/-V5h4Dyds1ZE/UyptFCvdJ9I/AAAAAAAAC7Y/FKaBEzLHQKg/s1600/gpedit.png\" width\u003d\"400\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nDouble-click the setting to view details.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://2.bp.blogspot.com/-oFBNj8TKKnE/UyptHZSOekI/AAAAAAAAC7g/x1HY4m5nrNk/s1600/gpedit_allow.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"194\" src\u003d\"https://2.bp.blogspot.com/-oFBNj8TKKnE/UyptHZSOekI/AAAAAAAAC7g/x1HY4m5nrNk/s1600/gpedit_allow.png\" width\u003d\"200\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nClick the Show button to see the list of servers that have been added to allow delegation.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://1.bp.blogspot.com/-ENfIMMQuc-Y/UyvF3ciEXqI/AAAAAAAAC9E/U9W9LUe4-2M/s1600/gpedit_list.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"135\" src\u003d\"https://1.bp.blogspot.com/-ENfIMMQuc-Y/UyvF3ciEXqI/AAAAAAAAC9E/U9W9LUe4-2M/s1600/gpedit_list.png\" width\u003d\"200\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr 
/\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003c/div\u003e\nOn the remote computer (e.g. server1 in this case),\u003cbr /\u003e\nmake sure that an HTTP or HTTPS listener has been set up. Run \u003ccode\u003ewinrm quickconfig\u003c/code\u003e to make sure it is properly set up.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ccode\u003e\nEnable-WSManCredSSP -Role Server -Force\u003c/code\u003e\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://3.bp.blogspot.com/-J37OvZQ3iog/UyvGRjEatfI/AAAAAAAAC9M/0LaqHtcCNgw/s1600/enable-wsmancredssp_server.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"82\" src\u003d\"https://3.bp.blogspot.com/-J37OvZQ3iog/UyvGRjEatfI/AAAAAAAAC9M/0LaqHtcCNgw/s1600/enable-wsmancredssp_server.png\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003cbr /\u003e\u003c/div\u003e\nNotice that you can actually use a PowerShell session (PSSession) to connect to the remote server and execute the command.\u003cbr /\u003e\n\u003cbr /\u003e\nTo verify,\u003cbr /\u003e\n\u003ccode\u003eGet-WSManCredSSP\u003c/code\u003e\u003cbr /\u003e\n\u003cbr /\u003e\nThe machine is not configured to allow delegating fresh credentials.\u003cbr /\u003e\nThis computer is configured to receive credentials from a remote client computer.\u003cbr /\u003e\n\u003cbr /\u003e\nNow, try again with our initial command. 
This time with the CredSSP authentication,\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ccode\u003e\nPS C:\\\u0026gt; $cred \u003d Get-Credential\u003cbr /\u003e\nPS C:\\\u0026gt; Enter-PSSession -ComputerName server1 -Credential $cred -Authentication Credssp\n\u003c/code\u003e\u003cbr /\u003e\n\u003cbr /\u003e\nThis error is encountered:\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ccode\u003e\u003cspan style\u003d\"color: red;\"\u003e\nEnter-PSSession : Connecting to remote server server1 failed with the following error message : The WinRM client cannot process the request. A computer policy does not allow the delegation of the user credentials to the target computer. Use gpedit.msc and look at the following policy: Computer Configuration -\u0026gt; Administrative Templates -\u0026gt; System -\u0026gt; Credentials Delegation -\u0026gt; Allow Delegating Fresh Credentials. \u0026nbsp;Verify that it is enabled and configured with an SPN appropriate for the target computer. For example, for a target computer name \"myserver.domain.com\", the SPN can be one of the following: WSMAN/myserver.domain.com or WSMAN/*.domain.com. For more information, see the about_Remote_Troubleshooting Help topic.\u003c/span\u003e\u003c/code\u003e\u003cbr /\u003e\n\u003cbr /\u003e\nThe ComputerName specified has to match the DelegateComputer, which is the FQDN (fully qualified domain name) specified earlier.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003ccode\u003e\nPS C:\\\u0026gt; $cred \u003d Get-Credential\u003cbr /\u003e\nPS C:\\\u0026gt; Enter-PSSession -ComputerName server1.testdomain.com -Credential $cred -Authentication Credssp\u003cbr /\u003e\n[server1.testdomain.com]: PS C:\\Users\\testuser\\Documents\u0026gt; Get-WmiObject -Class Win32_OperatingSystem -ComputerName server2 | Select-Object -ExpandProperty caption\u003c/code\u003e\u003cbr /\u003e\n\u003cbr /\u003e\nMicrosoft Windows Server 2008 R2 Standard \u003cbr /\u003e\n\u003cbr /\u003e\nHere is the security log for server1. 
The first hop uses Kerberos authentication, followed by the WSMan Negotiate authentication for CredSSP.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://2.bp.blogspot.com/-bveL5SYszt8/Uyu_7b9UfNI/AAAAAAAAC8g/jOtRRICsGNc/s1600/firsthop-auth.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"259\" src\u003d\"https://2.bp.blogspot.com/-bveL5SYszt8/Uyu_7b9UfNI/AAAAAAAAC8g/jOtRRICsGNc/s1600/firsthop-auth.png\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://2.bp.blogspot.com/-byiMayhEYcg/UyvAqtEkaFI/AAAAAAAAC8s/WNeZhQvaakk/s1600/firsthopws-auth.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"224\" src\u003d\"https://2.bp.blogspot.com/-byiMayhEYcg/UyvAqtEkaFI/AAAAAAAAC8s/WNeZhQvaakk/s1600/firsthopws-auth.png\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr /\u003e\nSince client1 has been set up to allow server1 to delegate its user credential (TestUser) through CredSSP, the second-hop authentication uses Kerberos authentication with the TestUser credential.\u003cbr /\u003e\n\u003cbr /\u003e\n\u003cdiv class\u003d\"separator\" style\u003d\"clear: both; text-align: center;\"\u003e\n\u003ca href\u003d\"http://4.bp.blogspot.com/-jMF9vWtYGpU/UyvAqqO5JAI/AAAAAAAAC80/9ODGbxRb4IQ/s1600/secondhop-kerberos-auth.png\" imageanchor\u003d\"1\" style\u003d\"margin-left: 1em; margin-right: 1em;\"\u003e\u003cimg border\u003d\"0\" height\u003d\"228\" src\u003d\"https://4.bp.blogspot.com/-jMF9vWtYGpU/UyvAqqO5JAI/AAAAAAAAC80/9ODGbxRb4IQ/s1600/secondhop-kerberos-auth.png\" width\u003d\"320\" /\u003e\u003c/a\u003e\u003c/div\u003e\n\u003cbr 
/\u003e"},"link":[{"rel":"replies","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/2441974662477405342/comments/default","title":"Post Comments"},{"rel":"replies","type":"text/html","href":"https://www.travisgan.com/2014/03/enable-powershell-double-hop-remoting.html#comment-form","title":"0 Comments"},{"rel":"edit","type":"application/atom+xml","href":"https://draft.blogger.com/feeds/3846425018755283358/posts/default/2441974662477405342"},{"rel":"self","type":"application/atom+xml","href":"https://www.travisgan.com/feeds/posts/default/2441974662477405342"},{"rel":"alternate","type":"text/html","href":"https://www.travisgan.com/2014/03/enable-powershell-double-hop-remoting.html","title":"Enable PowerShell Double-Hop Remoting"}],"author":[{"name":{"$t":"Travis"},"email":{"$t":"noreply@blogger.com"},"gd$image":{"rel":"http://schemas.google.com/g/2005#thumbnail","width":"35","height":"35","src":"//www.blogger.com/img/blogger_logo_round_35.png"}}],"media$thumbnail":{"xmlns$media":"http://search.yahoo.com/mrss/","url":"https://3.bp.blogspot.com/-qr3L1MyNwW8/UyuonMylrYI/AAAAAAAAC8M/dp5RDK1brd4/s72-c/doublehop.PNG","height":"72","width":"72"},"thr$total":{"$t":"0"}}]}});