<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:media="http://search.yahoo.com/mrss/"><channel><title><![CDATA[Steve Lathrop]]></title><description><![CDATA[Senior Dev/Lead/Manager]]></description><link>https://stevelathrop.net/</link><generator>Ghost 0.7</generator><lastBuildDate>Wed, 21 Nov 2018 06:54:21 GMT</lastBuildDate><atom:link href="https://stevelathrop.net/rss/" rel="self" type="application/rss+xml"/><ttl>60</ttl><item><title><![CDATA[Securing a Node.js REST API With Azure AD JWT Bearer Tokens]]></title><description><![CDATA[<p>Here's how to integrate Azure AD authentication with a Node.js REST API, for example. Specifically, here are the details on verifying an Azure AD-generated JWT Bearer Token.</p>

<h2 id="tldr">TL;DR</h2>

<ul>
<li><code>git clone</code> or download the project I have on GitHub <a href="https://github.com/slathrop/jwt-azure-ad-bearer-example">here</a></li>
<li>In <code>index.js</code> paste your Bearer token string (Base64,</li></ul>]]></description><link>https://stevelathrop.net/securing-a-node-js-rest-api-with-azure-ad-jwt-bearer-tokens/</link><guid isPermaLink="false">5d15260b-337a-4c2b-8109-3360fb26788b</guid><dc:creator><![CDATA[Steve Lathrop]]></dc:creator><pubDate>Mon, 05 Feb 2018 06:14:04 GMT</pubDate><content:encoded><![CDATA[<p>Here's how to integrate Azure AD authentication with a Node.js REST API. Specifically, here are the details on verifying an Azure AD-generated JWT Bearer Token.</p>

<h2 id="tldr">TL;DR</h2>

<ul>
<li><code>git clone</code> or download the project I have on GitHub <a href="https://github.com/slathrop/jwt-azure-ad-bearer-example">here</a></li>
<li>In <code>index.js</code> paste your Bearer token string (Base64, no "Bearer " prefix) into the <code>token</code> variable</li>
<li>Paste your public key X.509 Certificate string (without PEM prefix/suffix) into the <code>x5cString</code> variable</li>
<li>Run <code>npm install</code> and then <code>node .</code> from the command-line</li>
</ul>

<p>If your token is printed out on the console then verification/validation succeeded. Otherwise, an error message will be displayed.</p>

<p>Consider using additional <a href="https://github.com/auth0/node-jsonwebtoken#jwtverifytoken-secretorpublickey-options-callback">verify options</a> for improved security once you have the basic public key verification working.</p>
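<p>As an illustration of what those options protect against, here is a minimal, hand-rolled sketch of the claim checks that the <code>audience</code> and <code>issuer</code> verify options (plus the library's built-in expiry check) perform for you. All identifiers below are placeholders, not real tenant values.</p>

```javascript
// Placeholder claim validation mirroring jsonwebtoken's audience/issuer
// verify options and its built-in "exp" check (illustrative values only)
function validateClaims(payload, expected) {
  if (payload.aud !== expected.audience) throw new Error('audience mismatch');
  if (payload.iss !== expected.issuer) throw new Error('issuer mismatch');
  if (payload.exp * 1000 < Date.now()) throw new Error('token expired');
  return true;
}

const payload = {
  aud: 'my-app-client-id',
  iss: 'https://sts.windows.net/my-tenant-id/',
  exp: Math.floor(Date.now() / 1000) + 3600  // one hour from now
};

console.log(validateClaims(payload, {
  audience: 'my-app-client-id',
  issuer: 'https://sts.windows.net/my-tenant-id/'
}));  // -> true
```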

<h2 id="background">Background</h2>

<p>Let's say that you have an API endpoint using Node.js (Express, LoopBack, Feathers, etc.) and you want to accept <a href="https://jwt.io/introduction/#how-do-json-web-tokens-work-">JWT Bearer Tokens</a> issued by <a href="https://docs.microsoft.com/en-us/azure/active-directory/develop/active-directory-dev-understanding-oauth2-implicit-grant">Azure AD</a>. This is a terrific, stateless way of doing single sign-on (SSO) between, say, Microsoft Office 365 or SharePoint and your own custom single-page application (SPA).</p>

<p>You'll find some <a href="https://github.com/matt-ankerson/vue-adal">good examples</a> for using <a href="https://github.com/AzureAD/azure-activedirectory-library-for-js">ADAL</a> within the browser to get the Microsoft Azure AD-signed Bearer Token.</p>

<p>However, what is perhaps not so clear is how to validate or <strong>verify</strong> the Bearer token on the Node.js side in your API code. This little show-and-tell blog article describes the "trick" needed to perform the JWT Bearer token verification.</p>

<h2 id="startdowntherabbithole">Start Down the Rabbit Hole</h2>

<p>I'm going to say right off the bat that Microsoft seems to make this whole thing a bit more complicated than it has to be. I started by reading <a href="https://docs.microsoft.com/en-us/azure/active-directory/develop/active-directory-token-and-claims#validating-tokens">this</a> article, which tells me to examine <a href="https://login.microsoftonline.com/common/.well-known/openid-configuration">this</a> online JSON document that contains a <code>"jwks_uri"</code> value pointing to <a href="https://login.microsoftonline.com/common/discovery/keys">this URI</a> where the actual public keys are stored.</p>

<p>The public keys are provided in JSON format as well, and you must cross-reference into them to find the right one, presumably based on your particular Office 365 tenancy.</p>

<p>Let me explain what I mean by "cross-reference". Here's what the public keys JSON file looks like. Notice that there are 3 elements (objects) in the array of public key data (as of this writing).</p>

<pre><code class="language-javascript">{
  "keys": [
    {
      "kty": "RSA",
      "use": "sig",
      "kid": "z44wMdHu8wKsumrbfaK98qxs5YI",
      "x5t": "z44wMdHu8wKsumrbfaK98qxs5YI",
      "n": "p3HhMQsqmgSjeiDsZ1ay...",
      "e": "AQAB",
      "x5c": ["MIIDBTCCAe2gAwIBAgIQaD0..."]
    },
    {
      "kty": "RSA",
      "use": "sig",
      "kid": "SSQdhI1cKvhQEDSJxE2gGYs40Q0",
      "x5t": "SSQdhI1cKvhQEDSJxE2gGYs40Q0",
      "n": "pJUB90EMxiNjgkVz5CLL...",
      "e": "AQAB",
      "x5c": ["MIIDBTCCAe2gAwIBAgIQHJ7yHxN..."]
    },
    {
      "kty": "RSA",
      "use": "sig",
      "kid": "2S4SCVGs8Sg9LS6AqLIq6DpW-g8",
      "x5t": "2S4SCVGs8Sg9LS6AqLIq6DpW-g8",
      "n": "oZ-QQrNuB4ei9ATYrT61ebPt...",
      "e": "AQAB",
      "x5c": ["MIIDKDCCAhCgAwIBAgIQBH..."]
    }
  ]
}
</code></pre>

<p>The main index into the objects in the array is the <code>"x5t"</code> value (X.509 "Thumbprint", see <a href="https://tools.ietf.org/html/draft-ietf-jose-json-web-signature-31#page-12">this spec</a>):</p>

<ul>
<li><code>z44wMdHu8wKsumrbfaK98qxs5YI</code>,</li>
<li><code>SSQdhI1cKvhQEDSJxE2gGYs40Q0</code>, or</li>
<li><code>2S4SCVGs8Sg9LS6AqLIq6DpW-g8</code></li>
</ul>

<p>Following so far? Great.</p>

<h2 id="crossreferenceazureadissuedtokentogetthecorrectpublickey">Cross-Reference Azure AD-Issued Token to get the Correct Public Key</h2>

<p>To determine which public key your particular Bearer token can be verified with, examine the corresponding <code>"x5t"</code> value in the <strong>header</strong> section of <strong>your</strong> Bearer token.</p>
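<p>Reading that header value takes nothing more than base64url-decoding the first dot-separated segment of the token. Here is a minimal sketch using only Node built-ins; the sample token below is fabricated purely for illustration.</p>

```javascript
// Build a fake unsigned sample token just to demonstrate header decoding;
// in practice you would pass in your real Bearer token string.
const sampleHeader = { typ: 'JWT', alg: 'RS256', x5t: 'z44wMdHu8wKsumrbfaK98qxs5YI' };
const b64url = (obj) => Buffer.from(JSON.stringify(obj)).toString('base64url');
const sampleToken = [b64url(sampleHeader), b64url({ sub: 'user' }), 'fake-signature'].join('.');

// Decode the header (the first dot-separated segment) and read its "x5t"
function readX5t(token) {
  const headerJson = Buffer.from(token.split('.')[0], 'base64url').toString('utf8');
  return JSON.parse(headerJson).x5t;
}

console.log(readX5t(sampleToken));  // -> z44wMdHu8wKsumrbfaK98qxs5YI
```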

<p>Then, from the matching object in the <code>keys</code> array (shown above and as mentioned above, available <a href="https://login.microsoftonline.com/common/discovery/keys">here</a>), take the <code>"x5c"</code> value to construct your actual public key for token verification purposes.</p>
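<p>The lookup itself is then just a scan of the <code>keys</code> array. A sketch, with abbreviated stand-ins for the real key material shown earlier:</p>

```javascript
// Abbreviated stand-ins for the discovery document's "keys" array
const keys = [
  { kty: 'RSA', use: 'sig', x5t: 'z44wMdHu8wKsumrbfaK98qxs5YI', x5c: ['MIIDBTCCAe2gAwIBAgIQaD0...'] },
  { kty: 'RSA', use: 'sig', x5t: 'SSQdhI1cKvhQEDSJxE2gGYs40Q0', x5c: ['MIIDBTCCAe2gAwIBAgIQHJ7yHxN...'] }
];

// Return the "x5c" certificate string for the key whose "x5t" matches
// the thumbprint from your token's header, or null if none matches
function certForThumbprint(keys, x5t) {
  const key = keys.find((k) => k.x5t === x5t);
  return key ? key.x5c[0] : null;
}

console.log(certForThumbprint(keys, 'SSQdhI1cKvhQEDSJxE2gGYs40Q0'));  // -> MIIDBTCCAe2gAwIBAgIQHJ7yHxN...
```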

<h2 id="constructthepublickey">Construct the Public Key</h2>

<p>Now that you have the correct <code>"x5c"</code> string you're almost ready to verify your Bearer token! Wow, right?</p>

<p>The "trick" in this final step is to note that the <code>"x5c"</code> string is, as the label implies, an <a href="https://en.wikipedia.org/wiki/X.509">X.509 Certificate</a>.</p>

<p>Therefore, to successfully use this <code>"x5c"</code> string as a public key argument to a JWT token verification call, you need to perform a small string concatenation to put it into the <a href="https://en.wikipedia.org/wiki/X.509#Certificate_filename_extensions">expected PEM format</a> (also see <a href="https://en.wikipedia.org/wiki/Privacy-enhanced_Electronic_Mail">here</a>).</p>

<pre><code class="language-javascript">var jwt = require('jsonwebtoken');

// Paste in your Bearer token (Base64, no "Bearer " prefix) and the
// "x5c" certificate string (no PEM prefix/suffix)
var token = '...';
var x5cString = '...';

// Wrap the X.509 certificate string in the PEM header/footer that
// jwt.verify() expects for an RSA public key argument
var publicKey = '-----BEGIN CERTIFICATE-----\n' + x5cString + '\n-----END CERTIFICATE-----';

// Verify: throws on failure, returns the decoded token on success
var verifiedToken = jwt.verify(token, publicKey);
</code></pre>]]></content:encoded></item><item><title><![CDATA[.NET Core 2.0 Has Finally Landed!]]></title><description><![CDATA[<p>I'm reminded of Tom Hanks as Chuck Noland in <strong>Cast Away</strong>, struggling to push past the crashing waves to escape his lonely little island on a hand-built raft, seeking the freedom of the open sea and the help he hoped to find there.</p>

<p></p><p id="vid1" style="margin-bottom:-5px;font-size:small;color:darkgray">"Raft 2.0" Scene in <b>Cast Away</b></p>]]></description><link>https://stevelathrop.net/net-core-2-0-has-finally-landed/</link><guid isPermaLink="false">960c787a-e72d-499e-a258-94639cfc1e1b</guid><category><![CDATA[C#]]></category><category><![CDATA[Visual Studio]]></category><category><![CDATA[Microsoft]]></category><category><![CDATA[.NET]]></category><dc:creator><![CDATA[Steve Lathrop]]></dc:creator><pubDate>Wed, 16 Aug 2017 16:27:00 GMT</pubDate><content:encoded><![CDATA[<p>I'm reminded of Tom Hanks as Chuck Noland in <strong>Cast Away</strong>, struggling to push past the crashing waves to escape his lonely little island on a hand-built raft, seeking the freedom of the open sea and the help he hoped to find there.</p>

<p></p><p id="vid1" style="margin-bottom:-5px;font-size:small;color:darkgray">"Raft 2.0" Scene in <b>Cast Away</b></p><div style="padding-top:15px"><iframe width="853" height="480" src="https://www.youtube.com/embed/R8y-6suAexU?start=26" frameborder="0" allowfullscreen></iframe></div><p></p>

<p>An apt metaphor for Microsoft and .NET Core pushing into previously uncharted waters (for them) and the world of free and open source software.</p>

<p>Because remember, there was "Raft 1.0".</p>

<p><center><img src="http://3.bp.blogspot.com/-XAWUwo1arV0/TjqF6e5YB-I/AAAAAAAACTs/07ROiwxmiXU/s1600/cast-away-screenshots23.jpg" alt="&quot;Raft 1.0&quot; Fail" title=""></center></p>

<p>I'm not saying that .NET Core 1.0 was a failure. Just as Chuck Noland learned from his 1.0 effort, Microsoft did the same. And with 2.0 it feels like we've successfully escaped our little island. There's no going back, and it's clear sailing from here.</p>

<p>With <a href="https://blogs.msdn.microsoft.com/dotnet/2017/08/14/announcing-net-core-2-0/">.NET Core 2.0</a> Microsoft has managed to deliver on both <strong>platform</strong> and <strong>tooling</strong> at the same time, and deliver big.</p>

<h4 id="platform">Platform</h4>

<p>But let's start small and mention that ARM32 support is now available for Linux and Windows. So if you've been wanting to write C# for your Raspberry Pi IoT device, have at it.</p>

<h5 id="raspberrypi">Raspberry Pi</h5>

<p>The Windows option is <a href="https://developer.microsoft.com/en-us/windows/iot/GetStarted">Windows 10 IoT Core</a> Edition. And for Linux there are multiple distributions <a href="https://www.raspberrypi.org/downloads/">available</a>.</p>

<h5 id="netstandard">.NET Standard</h5>

<p>.NET Standard 2.0 is supported on the following platforms:</p>

<ul>
<li>.NET Framework 4.6.1</li>
<li>.NET Core 2.0</li>
<li>Mono 5.4</li>
<li>Xamarin.iOS 10.14</li>
<li>Xamarin.Mac 3.8</li>
<li>Xamarin.Android 7.5</li>
</ul>

<p>So .NET Standard 2.0 is now the way to do portable libraries. The Portable Class Library (PCL) approach <a href="https://blogs.msdn.microsoft.com/dotnet/2017/08/14/announcing-net-standard-2-0/#user-content-what-about-portable-class-libraries">has been deprecated</a>.</p>

<h5 id="cloud">Cloud</h5>

<p>Microsoft has already added support for .NET Core 2.0 to Azure Web Apps in all Azure regions.</p>

<h5 id="performance">Performance</h5>

<p>There's an overall performance improvement of around 20% in .NET Core 2.0 due to several factors such as <a href="https://blogs.msdn.microsoft.com/dotnet/2017/06/29/performance-improvements-in-ryujit-in-net-core-and-net-framework/">RyuJIT</a> and <a href="https://blogs.msdn.microsoft.com/dotnet/2017/07/20/profile-guided-optimization-in-net-core-2-0/">PGO</a>.</p>

<h4 id="tooling">Tooling</h4>

<p>There are so many improvements in the area of Visual Studio tooling that it really deserves its own blog post, so stay tuned for that.</p>

<h4 id="wrapup">Wrap Up</h4>

<p>If you're a .NET developer like me, perhaps "your ship has finally come in"...</p>

<div style="margin-bottom:15px"><iframe width="853" height="480" src="https://www.youtube.com/embed/TA1T6KwJyko" frameborder="0" allowfullscreen></iframe></div>

<p>in the form of <a href="https://blogs.msdn.microsoft.com/dotnet/2017/08/14/announcing-net-core-2-0/">.NET Core 2.0</a>.</p>

<p>You've <em>got</em> to be impressed with the commitment and coordination required to push all of this at the same time:</p>

<ul>
<li>3 Visual Studio Flavors (Windows, Mac, VSCode)</li>
<li>.NET Core</li>
<li>.NET Standard</li>
<li>ASP.NET Core</li>
<li>Entity Framework Core</li>
<li>Azure</li>
</ul>

<p>Quite an accomplishment for Microsoft and a terrific sign for those who make a living working on and deploying systems on the Microsoft stack!</p>]]></content:encoded></item><item><title><![CDATA[Contributing to Open Source Projects on Github]]></title><description><![CDATA[<p>So you're elated that someone else has already solved a large measure of the problem that you're grappling with in your website or app by posting their code on Github. And now that you've pulled their widget into your system you have identified some improvements that could be made.</p>

<h3 id="pullrequests">Pull</h3>]]></description><link>https://stevelathrop.net/contributing-to-open-source-projects-on-github/</link><guid isPermaLink="false">10cabbab-6341-47f8-858a-6cdce838b4cf</guid><dc:creator><![CDATA[Steve Lathrop]]></dc:creator><pubDate>Sun, 22 Jan 2017 18:46:17 GMT</pubDate><content:encoded><![CDATA[<p>So you're elated that someone else has already solved a large measure of the problem that you're grappling with in your website or app by posting their code on Github. And now that you've pulled their widget into your system you have identified some improvements that could be made.</p>

<h3 id="pullrequests">Pull Requests</h3>

<p>So now you're ready to make your first "Pull Request" and give back. You want everyone to benefit from the bug fix or new feature you've come up with for this widget.</p>

<p>And just as importantly, you'd like to stay up-to-date with everyone <strong>else's</strong> improvements to the source code repository as well.</p>

<h4 id="gettingthelatestupstreamchanges">Getting the Latest Upstream Changes</h4>

<p>To do this, make sure that you have added an <strong>upstream remote</strong> to your fork. The command for that is:</p>

<pre><code>git remote add upstream https://github.com/REPO/REPO.git  
</code></pre>

<p>Now whenever you want the latest changes added to your local branch just run these two commands:</p>

<pre><code># Fetch any new changes
git fetch upstream

# Merge any changes fetched into your working files
git merge upstream/master  
</code></pre>]]></content:encoded></item><item><title><![CDATA[Use JavaScript to Write SQL Queries and Schema Migrations With Knex.js]]></title><description><![CDATA[<p>So you're tired of writing all manner of SQL scripts in dialects such as <a href="https://en.wikipedia.org/wiki/Transact-SQL">T-SQL</a>, <a href="https://en.wikipedia.org/wiki/PL/SQL">PL/SQL</a>, <a href="https://en.wikipedia.org/wiki/PL/pgSQL">PL/pgSQL</a>, and <a href="https://en.wikipedia.org/wiki/SQL#Procedural_extensions">others, and various flavors</a> of the SQL standard itself.</p>

<p>Maybe you really need to support multiple SQL databases right now, or you want to write code that has some hope</p>]]></description><link>https://stevelathrop.net/use-javascript-to-write-sql-with-knex-js/</link><guid isPermaLink="false">ccbe6df3-9193-44c5-8c6f-034c9b289889</guid><category><![CDATA[Node.js]]></category><category><![CDATA[JavaScript]]></category><category><![CDATA[Ghost]]></category><category><![CDATA[Knex.js]]></category><dc:creator><![CDATA[Steve Lathrop]]></dc:creator><pubDate>Thu, 07 Jul 2016 02:00:00 GMT</pubDate><content:encoded><![CDATA[<p>So you're tired of writing all manner of SQL scripts in dialects such as <a href="https://en.wikipedia.org/wiki/Transact-SQL">T-SQL</a>, <a href="https://en.wikipedia.org/wiki/PL/SQL">PL/SQL</a>, <a href="https://en.wikipedia.org/wiki/PL/pgSQL">PL/pgSQL</a>, and <a href="https://en.wikipedia.org/wiki/SQL#Procedural_extensions">others, and various flavors</a> of the SQL standard itself.</p>

<p>Maybe you really need to support multiple SQL databases right now, or you want to write code that has some hope of being database-agnostic in the future.</p>

<p>And you don't particularly like the idea of an object-relational mapper (ORM) layer, at least for now.</p>

<p>If any of those criteria seem to fit your thinking, and you'd like to write your database queries and data seeding (DML) and/or schema migrations (DDL) in JavaScript, CoffeeScript, or TypeScript, then <a href="http://knexjs.org/">Knex.js</a> is worth trying! </p>

<p><center><a href="http://knexjs.org/"><img src="https://stevelathrop.net/content/images/2016/07/KnexLogo.png" alt="Knex.js Logo" title=""></a></center></p>

<h4 id="whousesit">Who Uses It?</h4>

<p>You'll be in pretty good company if you use Knex.js (or its sister project <a href="http://bookshelfjs.org/">Bookshelf.js</a> that provides an ORM layer). For example, the <a href="https://stevelathrop.net/tag/ghost-tag/">Ghost Blogging platform</a> uses Bookshelf.js.</p>

<h4 id="training">Training</h4>

<p>You can visit the <a href="http://knexjs.org/">Knex.js site</a> to learn more, but I also <strong>highly recommend</strong> the Pluralsight course <a href="https://www.pluralsight.com/courses/nodejs-data-access-using-knex">"Data Access in Node.js Using Knex"</a> by <a href="https://github.com/csaloio">Carlos Saloio</a>.</p>

<h4 id="moreinfo">More Info</h4>

<p>I also plan to write a number of additional blog posts about Knex.js myself, with a particular focus on TypeScript, so stay tuned!</p>]]></content:encoded></item><item><title><![CDATA[How to Setup a Simple Twilio Auto-responder]]></title><description><![CDATA[<p>So you're using <a href="https://www.twilio.com/sms">Twilio for Programmable SMS</a> and you want a simple, fixed auto-responder message to be sent when any of your SMS recipients replies to your original message. How to do it?</p>

<p>Well, first you should know that Twilio will automatically handle replies that include words like <code>STOP</code>, so</p>]]></description><link>https://stevelathrop.net/setup-a-simple-twilio-auto-responder/</link><guid isPermaLink="false">72d632f5-8345-43fb-bbc5-0def3a5c75ed</guid><category><![CDATA[Twilio]]></category><dc:creator><![CDATA[Steve Lathrop]]></dc:creator><pubDate>Tue, 05 Jul 2016 18:06:59 GMT</pubDate><content:encoded><![CDATA[<p>So you're using <a href="https://www.twilio.com/sms">Twilio for Programmable SMS</a> and you want a simple, fixed auto-responder message to be sent when any of your SMS recipients replies to your original message. How to do it?</p>

<p>Well, first you should know that Twilio will automatically handle replies that include words like <code>STOP</code>, so you don't need to worry about that.</p>

<p>But for other replies, perhaps you want a simple message informing your recipient that it's best to contact you via email.</p>

<p>To do this, the key concept to understand is that Twilio wants to read your message text from some public URL that you configure. So you'll need a public folder on a website that you control, and you'll need to place your message in an XML file in that folder. Twilio will make HTTP requests to your site just like a regular web browser.</p>

<p>For example, here's a sample <code>Reply.xml</code> file.</p>

<pre><code>&lt;?xml version="1.0" encoding="UTF-8"?&gt;  
&lt;Response&gt;  
  &lt;Sms&gt;
      Thanks for replying. The best way to contact us is
      by email at you@yourdomain.com.
  &lt;/Sms&gt;
&lt;/Response&gt;  
</code></pre>

<p>Let's say that you have placed this file on a website at the following URL: <a href="http://www.yourdomain.com/public/Reply.xml">http://www.yourdomain.com/public/Reply.xml</a>. Now, you need to tell Twilio about this URL. You can do this in one of two ways: (1) Directly on an individual sender telephone number, or (2) Using a group of sender numbers under a "Messaging Copilot" service.</p>

<p>I'll describe the more flexible "Copilot" configuration, available from your Twilio account at <a href="https://www.twilio.com/console/sms/services">https://www.twilio.com/console/sms/services</a>.</p>

<p><img src="https://stevelathrop.net/content/images/2016/07/screenshot-twilio-002.png" alt=""></p>

<p>Click on the big red button to add a new messaging service. You'll want to configure the "Request URL" setting to point to the URL for your <code>Reply.xml</code> file. In this example: <a href="http://www.yourdomain.com/public/Reply.xml">http://www.yourdomain.com/public/Reply.xml</a>. Be sure to specify the correct protocol: <code>http</code> or <code>https</code>.</p>

<p><img src="https://stevelathrop.net/content/images/2016/07/screenshot-twilio-004.png" alt=""></p>

<p>That's it. As long as Twilio can access the <code>Reply.xml</code> file you should now see your auto-response message when replying to SMSs.</p>

<p>It's beyond the scope of this post, but it's also possible to dynamically emit response messages based on context, etc. To do this, you can <a href="https://www.twilio.com/docs/api/twiml/sms/twilio_request">include parameters and values</a> in the configured URLs. Twilio sends this data to your site so that you can act upon it before responding.</p>]]></content:encoded></item><item><title><![CDATA[Ghost Turns 3 Years Old, Moves to Singapore]]></title><description><![CDATA[<p>Last year around this time I <a href="https://stevelathrop.net/happy-birthday-ghost-2yrs">wrote a post to celebrate the Ghost Blogging system's 2nd Birthday</a>. And now it's time to wish Ghost a Happy Birthday again.</p>

<p><center><a href="https://blog.ghost.org/year-3/"><img src="https://stevelathrop.net/content/images/2016/05/ghost3.jpg" alt="Ghost is 3!" title=""></a></center></p>

<p>Reading the <a href="https://blog.ghost.org/year-3/">3rd Birthday Post</a> from John O'Nolan, a few things stand out.</p>

<h4 id="payingdowndebt">Paying Down Debt</h4>

<p>Ghost's 3rd year was largely about</p>]]></description><link>https://stevelathrop.net/ghost-turns-3-moves-to-singapore/</link><guid isPermaLink="false">fe3e7b70-5781-4d3b-a94d-08f678b3df18</guid><category><![CDATA[Ghost]]></category><category><![CDATA[Blogging]]></category><category><![CDATA[Node.js]]></category><dc:creator><![CDATA[Steve Lathrop]]></dc:creator><pubDate>Wed, 04 May 2016 18:03:00 GMT</pubDate><media:content url="http://stevelathrop.net/content/images/2016/05/ghost3.jpg" medium="image"/><content:encoded><![CDATA[<img src="http://stevelathrop.net/content/images/2016/05/ghost3.jpg" alt="Ghost Turns 3 Years Old, Moves to Singapore"><p>Last year around this time I <a href="https://stevelathrop.net/happy-birthday-ghost-2yrs">wrote a post to celebrate the Ghost Blogging system's 2nd Birthday</a>. And now it's time to wish Ghost a Happy Birthday again.</p>

<p><center><a href="https://blog.ghost.org/year-3/"><img src="https://stevelathrop.net/content/images/2016/05/ghost3.jpg" alt="Ghost Turns 3 Years Old, Moves to Singapore" title=""></a></center></p>

<p>Reading the <a href="https://blog.ghost.org/year-3/">3rd Birthday Post</a> from John O'Nolan, a few things stand out.</p>

<h4 id="payingdowndebt">Paying Down Debt</h4>

<p>Ghost's 3rd year was largely about paying down technical debt and preparing a solid foundation for great things to come. A migration to <a href="http://digitalocean.com/?refcode=ca3f07c969d8">DigitalOcean</a> also occupied a lot of time I'm sure. The new <em>virtual</em> home for Ghost is excellent, and the <a href="https://www.digitalocean.com/customers/ghost/">dev ops choices made</a> are a great example for others to follow. These moves are sure to pay off in year 4.</p>

<h4 id="businessmove">Business Move</h4>

<p>Keeping with the theme of laying a solid foundation for the business, the decision was made to <a href="https://blog.ghost.org/moving-to-singapore/">re-domicile in Singapore</a>. Again, the foresight to make this move now and provide a solid <em>physical</em> home for Ghost as a business is going to benefit everyone who uses the Ghost platform.</p>

<h4 id="desktoptools">Desktop Tools</h4>

<p>Finally, the new desktop blogging tools introduced recently should provide some highly-desirable features in the future, such as an offline blogging experience.</p>

<h4 id="year4">Year 4</h4>

<p>Here's to a great year 4 ahead for Ghost!</p>]]></content:encoded></item><item><title><![CDATA[Meteor 1.3: ES2015 Modules, Mobile Improvements]]></title><description><![CDATA[<p>I've been following the development of the Meteor platform for a number of years, and it is very interesting to see some of the recent changes.</p>

<p><a href="https://www.meteor.com/" title="Meteor Home"><img src="https://stevelathrop.net/content/images/2016/07/MeteorLogoRed.jpg" alt="MeteorLogo" title=""></a></p>

<p>The JavaScript community has in some ways caught up to or overtaken Meteor, so the Meteor folks have had to adjust.</p>

<p><a href="http://info.meteor.com/blog/announcing-meteor-1.3">Meteor 1.3</a></p>]]></description><link>https://stevelathrop.net/meteor-1-3-es2015-modules-mobile-improvements/</link><guid isPermaLink="false">52e79ebf-47de-4543-a89e-4f8962c6cdd9</guid><category><![CDATA[Meteor]]></category><category><![CDATA[Node.js]]></category><category><![CDATA[MongoDB]]></category><category><![CDATA[JavaScript]]></category><dc:creator><![CDATA[Steve Lathrop]]></dc:creator><pubDate>Thu, 03 Mar 2016 14:25:00 GMT</pubDate><content:encoded><![CDATA[<p>I've been following the development of the Meteor platform for a number of years, and it is very interesting to see some of the recent changes.</p>

<p><a href="https://www.meteor.com/" title="Meteor Home"><img src="https://stevelathrop.net/content/images/2016/07/MeteorLogoRed.jpg" alt="MeteorLogo" title=""></a></p>

<p>The JavaScript community has in some ways caught up to or overtaken Meteor, so the Meteor folks have had to adjust.</p>

<p><a href="http://info.meteor.com/blog/announcing-meteor-1.3">Meteor 1.3</a> is the first release that really manifests the adjustments that are being made.</p>

<h4 id="es2015modules">ES2015 Modules</h4>

<p>The big ticket item in this release is support for ES2015 (aka ES6) Modules. While <a href="https://stevelathrop.net/meteor-1-2-released/">Meteor 1.2 added support for ES2015 syntax</a>, full support for modules was not included.</p>

<p>This move reflects the way that the JavaScript language itself has caught up to the once unique Meteor Package concept. And it acknowledges the popularity of npm as Meteor moves toward full support for and use of npm.</p>

<h4 id="mobilesupportviacordovagreatlyimproved">Mobile Support via Cordova Greatly Improved</h4>

<p>Meteor 1.3 now uses the latest version of Cordova (6.0.0) and on iOS now supports WKWebView for dramatic performance improvements. Hot code push for mobile apps is also greatly improved:</p>

<blockquote>
  <p>...wrappers detect and handle faulty JavaScript application code. It is now possible to recover from hot code pushes of broken code that previously required an application reinstall. The hot code push system is also much faster and uses incremental and resumable updates to save network bandwidth and battery life. </p>
</blockquote>

<h4 id="onthehorizon">On the Horizon</h4>

<p>Next up in Meteor 1.4 we expect to see an upgrade of Meteor to Node.js version 4 and MongoDB 3.2. I'm looking forward to those improvements!</p>]]></content:encoded></item><item><title><![CDATA[Meteor 1.2 Released: ES2015, View Engines, Performance]]></title><description><![CDATA[<p>The Meteor Team <a href="http://info.meteor.com/blog/announcing-meteor-1.2">announced the release of Meteor 1.2</a> this week. Detailed release notes are available <a href="https://github.com/meteor/meteor/blob/devel/History.md#v12-2015-sept-21">here</a>.</p>

<p>This is an important release, and it really emphasizes what Meteor can offer as a complete <em>development platform</em> based on JavaScript.</p>

<p><center><a href="https://www.youtube.com/watch?v=8G2SMVIUNNk&amp;feature=youtu.be&amp;t=1535"><img src="https://stevelathrop.net/content/images/2015/09/Meteor-Platform-Youtube.png" alt="" title=""></a></center></p>

<p>For example, view engine frameworks such as <a href="https://www.meteor.com/tutorials/angular/creating-an-app">Angular</a> and <a href="https://www.meteor.com/tutorials/react/creating-an-app">React</a> are</p>]]></description><link>https://stevelathrop.net/meteor-1-2-released/</link><guid isPermaLink="false">8f75b0c1-8431-4474-b5d1-7532ebd4f1b1</guid><category><![CDATA[Meteor]]></category><category><![CDATA[JavaScript]]></category><dc:creator><![CDATA[Steve Lathrop]]></dc:creator><pubDate>Fri, 25 Sep 2015 17:32:00 GMT</pubDate><content:encoded><![CDATA[<p>The Meteor Team <a href="http://info.meteor.com/blog/announcing-meteor-1.2">announced the release of Meteor 1.2</a> this week. Detailed release notes are available <a href="https://github.com/meteor/meteor/blob/devel/History.md#v12-2015-sept-21">here</a>.</p>

<p>This is an important release, and it really emphasizes what Meteor can offer as a complete <em>development platform</em> based on JavaScript.</p>

<p><center><a href="https://www.youtube.com/watch?v=8G2SMVIUNNk&amp;feature=youtu.be&amp;t=1535"><img src="https://stevelathrop.net/content/images/2015/09/Meteor-Platform-Youtube.png" alt="" title=""></a></center></p>

<p>For example, view engine frameworks such as <a href="https://www.meteor.com/tutorials/angular/creating-an-app">Angular</a> and <a href="https://www.meteor.com/tutorials/react/creating-an-app">React</a> are now supported on the Meteor platform.</p>

<p>And the latest incarnation of JavaScript itself, ES2015, is fully supported across all JavaScript code written for the platform (from browser to server).</p>

<p>Think about that for a moment. Meteor isn't just a web framework, it's an entire platform providing code compilation and optimization. Otherwise, how would it be able to support ES2015 across both client and server?</p>

<p>As the Meteor folks said in the announcement:</p>

<blockquote>
  <p>...we think anyone writing an app in JavaScript should be using ES2015.  And it's going to drive a lot of new JavaScript adoption from developers used to other languages.  So we're all in on it: the <a href="https://www.meteor.com/tutorials/blaze/creating-an-app">Meteor tutorial</a> and a growing fraction of Meteor core is now pure ES2015.  We've found that it lets us write <strong>dramatically more concise and readable application code</strong>, thanks to built-in support for classes, block variable scoping, arrow functions, template strings, and numerous other improvements to the language.  To learn more about ES2015 and how valuable we've found it in our own work, watch <a href="https://www.youtube.com/watch?v=05Z6YGiZKmE">Ben Newman's recent Devshop talk</a>.</p>
</blockquote>

<p>And as a platform, Meteor has improved some internal performance factors as well in version 1.2. For example, the <a href="https://www.meteor.com/ddp">distributed data protocol (DDP)</a> used by Meteor to send data (e.g., an Episode document) down to the browser now uses compression to reduce on-the-wire bandwidth. And on the server, Meteor has <a href="https://github.com/meteor/meteor/pull/4694">improved MongoDB oplog tailing logic</a> to reduce the overhead of notifying clients of large numbers of writes.</p>

<p><span style="display:none"> <br>
If you've been reading <a href="https://stevelathrop.net/tag/meteor">my other posts about Meteor</a>, the improvements in version 1.2 offer all the more reason for you to check out Meteor for yourself. <br>
</span></p>]]></content:encoded></item><item><title><![CDATA[MongoDB Dev/DBA Tools: MTools]]></title><description><![CDATA[<p>During "Rapid Start" engagements with folks from MongoDB Inc., some sample MongoDB documents with "fake" data are often generated using a handy little set of utilities called <strong><a href="https://github.com/rueckstiess/mtools">mtools</a></strong>.</p>

<p><center><a href="https://github.com/rueckstiess/mtools"><img src="https://stevelathrop.net/content/images/2015/09/mtools.png" alt="" title=""></a></center></p>

<p>This type of utility is very useful, particularly for generating fake data for testing data validation logic. The utility was developed by</p>]]></description><link>https://stevelathrop.net/mongodb-dev-dba-tools-mtools/</link><guid isPermaLink="false">f18e979a-3242-477c-a497-21df1b6330d5</guid><category><![CDATA[MongoDB]]></category><category><![CDATA[Testing]]></category><dc:creator><![CDATA[Steve Lathrop]]></dc:creator><pubDate>Mon, 21 Sep 2015 14:20:10 GMT</pubDate><content:encoded><![CDATA[<p>During "Rapid Start" engagements with folks from MongoDB Inc., some sample MongoDB documents with "fake" data are often generated using a handy little set of utilities called <strong><a href="https://github.com/rueckstiess/mtools">mtools</a></strong>.</p>

<p><center><a href="https://github.com/rueckstiess/mtools"><img src="https://stevelathrop.net/content/images/2015/09/mtools.png" alt="" title=""></a></center></p>

<p>This type of utility is very useful, particularly for generating fake data for testing data validation logic. The utility was developed by an engineer out of the MongoDB Inc. New York City office.</p>
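<p>For reference, the generator utility in mtools (<code>mgenerate</code>) builds documents from a JSON template. The sketch below is illustrative only; the operator names and parameter forms are assumptions from memory, so check the mtools wiki for the actual template syntax.</p>

<pre><code class="language-json">{
  "user": {
    "name": "$name",
    "age": "$number"
  },
  "status": { "$choose": ["active", "inactive", "pending"] },
  "created": "$datetime"
}
</code></pre>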

<p>A nice summary report from the "Rapid Start" sessions is typically delivered to the client, and a brief mention of the <strong>mtools</strong> fake data generator usually appears on the final pages of the PDF, including the fake data "template" used to generate documents. This template could easily be adapted for data validation testing.</p>]]></content:encoded></item><item><title><![CDATA[MongoDB Indexing 101: Compound Indexes]]></title><description><![CDATA[<p>When you license MongoDB, you typically kick-off your MongoDB relationship with the <a href="https://webassets.mongodb.com/MongoDB_RapidStart_Datasheet.pdf">"Rapid Start"</a> on-site training and consulting provided by folks from MongoDB Inc.</p>

<p><center><img src="https://stevelathrop.net/content/images/2015/09/mongodb-index-for-sort.png" alt="" title=""></center></p>

<p>One piece of technical information presented on the topic of Indexing makes for a nice blog post that can be referred to again later. So let's</p>]]></description><link>https://stevelathrop.net/mongodb-indexing-101/</link><guid isPermaLink="false">abfdce8c-de76-4eac-a630-c09ca462c6d4</guid><category><![CDATA[MongoDB]]></category><category><![CDATA[NoSQL: Getting Started]]></category><dc:creator><![CDATA[Steve Lathrop]]></dc:creator><pubDate>Wed, 09 Sep 2015 15:35:16 GMT</pubDate><content:encoded><![CDATA[<p>When you license MongoDB, you typically kick-off your MongoDB relationship with the <a href="https://webassets.mongodb.com/MongoDB_RapidStart_Datasheet.pdf">"Rapid Start"</a> on-site training and consulting provided by folks from MongoDB Inc.</p>

<p><center><img src="https://stevelathrop.net/content/images/2015/09/mongodb-index-for-sort.png" alt="" title=""></center></p>

<p>One piece of technical information presented on the topic of Indexing makes for a nice blog post that can be referred to again later. So let's jump right in.</p>

<p><a href="http://docs.mongodb.org/manual/core/index-compound/">Compound Indexes</a> reference multiple fields from a collection in a single index structure. This is probably the most helpful type of index to create for optimizing a particular query, such as a query that searches for Customers.</p>

<p>The developer or DBA creating a compound index must determine the optimal field order within the index. The concept is to let selectivity drive the order of the fields.</p>

<ul>
<li>Order fields in a compound index from most selective to least selective</li>
<li>Usually, this means equality fields (fields queried for a specific value) before range fields (e.g., date fields queried for a particular time period)</li>
<li>When dealing with multiple equality values, start with the most selective</li>
</ul>
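<p>As a quick sketch of the "equality before range" guideline (the collection and field names here are hypothetical):</p>

<pre><code class="language-javascript">// Query: orders for one customer placed within a date range, e.g.
//   db.orders.find({ customerId: 42, orderDate: { $gte: start, $lt: end } })
//
// "Equality before range": the equality field (customerId) comes first,
// and the range field (orderDate) second.
var indexSpec = { customerId: 1, orderDate: 1 };

// mongo shell: db.orders.createIndex(indexSpec)
</code></pre>

<p>Reversing that field order would force MongoDB to scan every index entry in the date range for every customer, rather than jumping straight to one customer's entries.</p>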

<p>The desired sort order of the query results must be considered as well. And when we consider sorting, we can summarize the general rules of thumb for the optimal compound index as follows.</p>

<ul>
<li>Equality before range</li>
<li>Equality before sorting</li>
<li>Sorting before range</li>
</ul>]]></content:encoded></item><item><title><![CDATA[MongoDB Bitwise Queries in Version 3.2]]></title><description><![CDATA[<p>As a follow-on to <a href="https://stevelathrop.net/mongodb-new-features-in-version-3-2">my recent post about the upcoming features in MongoDB Version 3.2</a>, I wanted to mention a specific feature that hasn't made headlines, but is nonetheless quite interesting to developers: <a href="https://jira.mongodb.org/browse/SERVER-3518">Bitwise Queries</a>.</p>

<p><center><img src="https://stevelathrop.net/content/images/2015/06/mongodb-3-2.png" alt="" title=""></center></p>

<p>MongoDB has supported <a href="http://docs.mongodb.org/manual/reference/operator/update/bit/#up._S_bit">bitwise update operations</a> for some time. But if you wanted to</p>]]></description><link>https://stevelathrop.net/mongodb-bitwise-queries-in-version-3-2/</link><guid isPermaLink="false">0f9ec980-df6f-4b93-a0ab-f35cbba351c6</guid><category><![CDATA[MongoDB]]></category><category><![CDATA[Validations]]></category><dc:creator><![CDATA[Steve Lathrop]]></dc:creator><pubDate>Tue, 01 Sep 2015 18:55:33 GMT</pubDate><content:encoded><![CDATA[<p>As a follow-on to <a href="https://stevelathrop.net/mongodb-new-features-in-version-3-2">my recent post about the upcoming features in MongoDB Version 3.2</a>, I wanted to mention a specific feature that hasn't made headlines, but is nonetheless quite interesting to developers: <a href="https://jira.mongodb.org/browse/SERVER-3518">Bitwise Queries</a>.</p>

<p><center><img src="https://stevelathrop.net/content/images/2015/06/mongodb-3-2.png" alt="" title=""></center></p>

<p>MongoDB has supported <a href="http://docs.mongodb.org/manual/reference/operator/update/bit/#up._S_bit">bitwise update operations</a> for some time. But if you wanted to find or atomically find-and-modify documents with fields matching a certain bit pattern, you were out of luck. Now, as of <a href="http://blog.mongodb.org/post/124686547743/mongodb-316-is-released">Developer Preview 3.1.6</a>, <a href="https://jira.mongodb.org/browse/SERVER-3518">bitwise query support</a> is built in.</p>

<h4 id="example">Example</h4>

<p>For data collection systems with many multi-select lists and checkboxes to deal with, this could be a particularly useful feature. You may be storing these lists as an array using (at least) 4 bytes (int32) to save each and every selection made by the user. With a multi-select offering, say, 30 options/checkboxes, checking all of them costs 30 × 4 = 120 bytes.</p>

<p>By storing this information in a bitwise fashion in a single, 32-bit value, for example, the space occupied by the element would be 4 bytes, regardless of which checkboxes are checked by the user.</p>
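<p>As a sketch of the idea (the collection and field names here are hypothetical; the query at the end uses the new <code>$bitsAllSet</code> operator):</p>

<pre><code class="language-javascript">// Encode 30 checkboxes as bit flags in a single 32-bit integer:
// bit i set means checkbox i is checked.
function setBit(flags, i) { return flags | Math.pow(2, i); }
function hasBit(flags, i) { return Math.floor(flags / Math.pow(2, i)) % 2 === 1; }

var flags = 0;
flags = setBit(flags, 0);  // first checkbox checked
flags = setBit(flags, 29); // last of 30 checked

// Finding documents where both of those bits are set becomes, e.g.:
//   db.responses.find({ flags: { $bitsAllSet: [0, 29] } })
</code></pre>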

<p>Yes, this savings of 116 bytes seems negligible. But if we saved this over 20 fields per document, and 500 documents per quarter, and 1000 users per quarter, it adds up! In this example, it adds up to a savings of about 1GB per quarter, per database by my calculations. If you had 25 such databases, that's roughly 100GB in space savings per year.</p>

<h4 id="bitwisestoragecanimproveoverallperformance">Bitwise Storage Can Improve Overall Performance</h4>

<p>To quote a <a href="https://www.mongodb.com/blog/post/new-compression-options-mongodb-30">MongoDB blog post regarding data compression</a>:</p>

<blockquote>
  <p>Size is one factor, and there are others. Disk I/O latency is dominated by seek time on rotational storage. By decreasing the size of the data, fewer disk seeks will be necessary to retrieve a given quantity of data, and disk I/O throughput will improve. In terms of RAM, some compressed formats can be used without decompressing the data in memory. In these cases more data can fit in RAM, which improves performance. </p>
</blockquote>

<p>So there's more to be gained by implementing bitwise data storage schemes than simply the cost savings on storage media.</p>

<p>Another subtle implication of this bitwise approach is that updates to these types of elements in a given document <a href="http://blog.mongodb.org/post/248614779/fast-updates-with-mongodb-update-in-place">can occur "in-place"</a> without expanding the size of the document.</p>]]></content:encoded></item><item><title><![CDATA[Use Git to Work With TFS]]></title><description><![CDATA[<p>Once you start using <a href="https://en.wikipedia.org/wiki/Git_%28software%29">Git</a> for version control you never want to go back to the old centralized version control model of tools like TFS. But what if you are stuck with TFS for some reason and can't move your code to a Git repo?</p>

<p><center><img src="https://stevelathrop.net/content/images/2015/08/git-logo.png" alt="" title=""></center></p>

<h4 id="gittftotherescue">Git-TF to the Rescue</h4>

<p>With</p>]]></description><link>https://stevelathrop.net/use-git-to-work-with-tfs/</link><guid isPermaLink="false">fbbaca18-da69-478c-a473-e0543a040170</guid><category><![CDATA[TFS]]></category><category><![CDATA[Git]]></category><dc:creator><![CDATA[Steve Lathrop]]></dc:creator><pubDate>Mon, 10 Aug 2015 15:22:31 GMT</pubDate><content:encoded><![CDATA[<p>Once you start using <a href="https://en.wikipedia.org/wiki/Git_%28software%29">Git</a> for version control you never want to go back to the old centralized version control model of tools like TFS. But what if you are stuck with TFS for some reason and can't move your code to a Git repo?</p>

<p><center><img src="https://stevelathrop.net/content/images/2015/08/git-logo.png" alt="" title=""></center></p>

<h4 id="gittftotherescue">Git-TF to the Rescue</h4>

<p>With the cross-platform <a href="https://gittf.codeplex.com/">Git-TF</a> tools, you can use Git to work with TFS. A couple of simple commands provided by Git-TF allow you to push your Git commits as TFS check-ins, pull the latest code from TFS into your Git repo, etc.</p>

<p>One of the nice side benefits of syncing your Git repo with a TFS folder is that you can organize your code in whatever local folder structure you wish. With a typical TFS-based project, your local folders must match the structure defined in the central TFS location. But with Git-TF, you can hook up your local Git repo folder to whatever TFS central folder or sub-folder you wish to sync with.</p>

<h4 id="gittfworkflow">Git-TF Workflow</h4>

<p>Here are the Git-TF commands that you'll want to learn.</p>

<p>For initial setup of a Git repo based on a folder in TFS, run a command like the following.  </p>

<pre><code class="language-bash">git tf clone https://yourproject.visualstudio.com/DefaultCollection $/YourRepo/SubFolder repo  
</code></pre>

<p>That command should be run with your current working directory set to the parent folder for your new, local Git repo. And in this example, the subfolder <code>repo</code> is being created. The TFS folder to which you are mapping is indicated with the opening dollar sign, which in this example is <code>$/YourRepo/SubFolder</code>.</p>

<p>After that initial setup is out of the way, change your working directory into the subfolder you just created. For example:</p>

<pre><code class="language-bash">cd repo  
</code></pre>

<p>And now you may want to configure your TFS username and password so that you won't be prompted for them every time you issue a command. (Note that these values are stored in plain text in your local Git config.)</p>

<pre><code class="language-bash">git config git-tf.server.username YourUsernameOrEmail  
</code></pre>

<pre><code class="language-bash">git config git-tf.server.password YourPassword  
</code></pre>

<p>Now you make your changes to the files in your local Git repo and commit them locally as needed.</p>

<pre><code class="language-bash">git add --all .  
git commit -am "commit one"  
</code></pre>

<pre><code class="language-bash">git add --all .  
git commit -am "commit two"  
</code></pre>

<p>Finally, you sync your local changes and any remote TFS changes made by others with these two commands:</p>

<pre><code class="language-bash">git tf pull --rebase  
git tf checkin  
</code></pre>

<p>That's it for the typical workflow. Rinse and repeat the last 4 commands (<code>add</code>, <code>commit</code>, <code>pull</code>, and <code>checkin</code>).</p>

<p>I'd recommend creating a batch script in your local repo folder (call it <code>tf-checkin.bat</code> for example) that prompts for your commit message and executes all of the commands. Here's an example:</p>

<pre><code class="language-batch">@echo off
SET /P CM=Enter Commit Message:  
git add --all .  
git commit -am "%CM%"  
git tf pull --rebase  
git tf checkin  
</code></pre>

<h4 id="tfsonlineandssl">TFS Online and SSL</h4>

<p>When syncing with a TFS Online project, Git-TF connects over SSL. Under the hood, Git-TF uses Java, so you may need to import your org's SSL certificate into the Java keystore. Here's an example of the command to do that.</p>

<pre><code class="language-bash">"C:\Program Files\Java\jre1.8.0_51\bin\keytool" -import -keystore "C:\Program Files\Java\jre1.8.0_51\lib\security\cacerts" -alias startssl -file "M:\My Org Cert\MyOrg-CA.cer"
</code></pre>

<p>I was prompted to enter a password for the keystore. The default password is: <strong>changeit</strong></p>

<p>Note that the paths in the example command above will be different depending on the version of Java installed on your system. In the example above, the version <strong>1.8.0_51</strong> is referenced in two places.</p>

<h4 id="gitandssl">Git and SSL</h4>

<p>Git itself may also complain that you have a "self signed certificate in certificate chain". If that happens, run the following command too. (Be aware that this disables SSL certificate verification for all of your Git remotes; if possible, point <code>http.sslCAInfo</code> at your org's certificate file instead.)</p>

<pre><code class="language-bash">git config --global http.sslVerify false  
</code></pre>

<h4 id="conclusion">Conclusion</h4>

<p>I think you'll find that investing a little time to learn Git and use Git-TF for code located in TFS will save you a lot of time and frustration down the road.</p>]]></content:encoded></item><item><title><![CDATA[MongoChef Cooks Up Tasty, Schemaless Treats]]></title><description><![CDATA[<p>I just came across a new (to me) developer/DBA tool for MongoDB called <a href="http://3t.io/mongochef">MongoChef</a>. And I'm finding that the Chef is quite talented.</p>

<h4 id="querybuilder">Query Builder</h4>

<p>The visual Query Builder is what people new to MongoDB (and partial to SQL syntax) will appreciate the most, because it eases you into</p>]]></description><link>https://stevelathrop.net/mongochef-cooks-up-tasty-treats/</link><guid isPermaLink="false">4544ee8d-1f4f-46d5-ba0a-28d76cfea3d9</guid><category><![CDATA[NoSQL: Getting Started]]></category><category><![CDATA[MongoDB]]></category><dc:creator><![CDATA[Steve Lathrop]]></dc:creator><pubDate>Fri, 17 Jul 2015 19:19:19 GMT</pubDate><content:encoded><![CDATA[<p>I just came across a new (to me) developer/DBA tool for MongoDB called <a href="http://3t.io/mongochef">MongoChef</a>. And I'm finding that the Chef is quite talented.</p>

<h4 id="querybuilder">Query Builder</h4>

<p>The visual Query Builder is what people new to MongoDB (and partial to SQL syntax) will appreciate the most, because it eases you into the JSON-based query language that is natively supported by MongoDB, making you productive with MongoDB immediately.</p>

<p>You can easily:</p>

<ol>
<li>Select a "table" (MongoDB Collection, as in the left-most pane in the screenshot below)  </li>
<li>Drag a field from the table-view of your MongoDB Collection (the middle pane below), and  </li>
<li>Drop the field into the Query Builder (the right-most pane) to build up your "where" clause and to specify sorting and projection (which fields should be included or excluded)</li>
</ol>
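<p>What the Query Builder produces in the end is just an ordinary MongoDB query. As an illustrative sketch (the field names here are hypothetical), the filter, sort, and projection built up in the panes correspond to:</p>

<pre><code class="language-javascript">var filter     = { status: "active", age: { $gt: 25 } };  // the "where" clause
var sort       = { lastName: 1 };                         // ascending sort
var projection = { firstName: 1, lastName: 1, _id: 0 };   // included/excluded fields

// mongo shell equivalent:
//   db.customers.find(filter, projection).sort(sort)
</code></pre>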

<p></p><p id="screenshot1" style="margin-bottom:-5px;font-size:small;color:darkgray">Screenshot 1 - MongoChef Query Builder</p><div style="padding-top:15px"><a href="https://stevelathrop.net/content/images/2015/07/MongoChefQueryBuilder.png"><img src="https://stevelathrop.net/content/images/2015/07/MongoChefQueryBuilder.png"></a></div><p></p>

<h4 id="mongodb3xsupport">MongoDB 3.x Support</h4>

<p>One of the shortcomings of some other offerings such as <a href="http://www.robomongo.org/">RoboMongo</a> is lack of support for the new <a href="http://docs.mongodb.org/manual/faq/storage/">storage engines</a> introduced in MongoDB 3.0.</p>

<p>MongoChef fully supports MongoDB 3.0, including the WiredTiger and <a href="https://stevelathrop.net/mongodb-in-memory-storage-engine-setup">In-memory</a> storage engines.</p>

<h4 id="licensing">Licensing</h4>

<p>MongoChef is offered for free for "non-commercial" use, which I take to mean that if you are simply using it to personally learn MongoDB, you are honoring the terms of the free license. Otherwise, MongoChef is licensed per-person (on as many machines as that person wishes to use) for $99.</p>

<h4 id="conclusion">Conclusion</h4>

<p><strong>MongoChef</strong> by <em>3T Software Labs</em> is a worthwhile tool with many useful features. It is recommended (and sponsored) by <a href="http://www.red-gate.com/products/">Red Gate Software</a>. Check it out by registering on the <a href="http://3t.io">3T website</a>.</p>]]></content:encoded></item><item><title><![CDATA[MongoDB: New Features on Tap in Version 3.2]]></title><description><![CDATA[<p>At their recent annual conference, <strong>MongoDB World 2015</strong>, <a href="https://www.mongodb.com/press/new-features-at-global-user-conference">MongoDB announced</a> a number of interesting new features slated for Version 3.2 of MongoDB.</p>

<p><center><img src="https://stevelathrop.net/content/images/2015/06/mongodb-3-2.png" alt="" title=""></center></p>

<p>Here's a summary of the new features mentioned in the announcement:</p>

<blockquote>
  <ul>
  <li><p><strong>Support for BI and Visualization Tools</strong>. A new connector for BI and visualization provides SQL-based access</p></li></ul></blockquote>]]></description><link>https://stevelathrop.net/mongodb-new-features-in-version-3-2/</link><guid isPermaLink="false">8753a5ec-495e-48b4-8feb-4389560ea7c9</guid><category><![CDATA[MongoDB]]></category><category><![CDATA[Validations]]></category><dc:creator><![CDATA[Steve Lathrop]]></dc:creator><pubDate>Thu, 18 Jun 2015 14:26:00 GMT</pubDate><content:encoded><![CDATA[<p>At their recent annual conference, <strong>MongoDB World 2015</strong>, <a href="https://www.mongodb.com/press/new-features-at-global-user-conference">MongoDB announced</a> a number of interesting new features slated for Version 3.2 of MongoDB.</p>

<p><center><img src="https://stevelathrop.net/content/images/2015/06/mongodb-3-2.png" alt="" title=""></center></p>

<p>Here's a summary of the new features mentioned in the announcement:</p>

<blockquote>
  <ul>
  <li><p><strong>Support for BI and Visualization Tools</strong>. A new connector for BI and visualization provides SQL-based access to MongoDB, opening up data from modern applications to the BI and Visualization tools used by 100s of millions of information workers, including Tableau, BusinessObjects, Qlik, and Cognos. The connector provides unprecedented performance and scalability by leveraging MongoDB’s powerful Aggregation Framework to process data within the database.  </p></li>
  <li><p><strong>Encryption for Data at Rest</strong>. Building on MongoDB’s extensive security features, including encryption for data on the wire, MongoDB 3.2 will add a new option for encrypting data at rest, with keys secured by the industry standard Key Management Interoperability Protocol (KMIP). At-rest encryption is especially critical for regulated industries such as healthcare, financial services, retailers and certain government agencies. Security is top of mind for CIOs today, and with encryption for data at rest organizations can address their stringent security requirements.  </p></li>
  <li><p><strong>Document Validation</strong>. For decades, organizations have relied on databases to play an integral role in their data governance strategy by ensuring the quality and integrity of their data. With MongoDB 3.2, document validation rules can be applied to one or more fields to verify data types, values and the presence of mandatory fields. By combining document validation with MongoDB’s dynamic schema, organizations will have enormous flexibility in defining rules that ensure data quality while simplifying application development.  </p></li>
  <li><p><strong>Dynamic Lookups</strong>. A key to enabling access to BI and Visualization tools is the ability to combine data across collections. Lookups provide greater flexibility for modeling data and add to the wide range of use cases for which MongoDB is well suited. With MongoDB 3.2, lookups will be available as part of the Aggregation Framework.  </p></li>
  <li><p><strong>Schema Visualization</strong>. A new graphical interface to MongoDB, code named mongoScout, will provide developers and DBAs a powerful, intuitive interface for exploring and understanding their data structure. In addition to displaying individual documents, mongoScout analyzes collections to visualize the existence of fields and the cardinality of their values. DBAs and developers can use mongoScout to make intelligent decisions about indexes and validation rules.</p></li>
  </ul>
</blockquote>

<p>I haven't been able to find any further details related to the <strong>Document Validation</strong> feature. I'm curious to know how this will be implemented.</p>

<p>A follow-up announcement will be released soon with information about the availability of an advanced beta program for previewing these new features before their full release, which is slated for Q4 2015.</p>]]></content:encoded></item><item><title><![CDATA[Write JavaScript in C#]]></title><description><![CDATA[<p>There's an interesting little project out there in beta now called <a href="http://duoco.de/">DuoCode</a> that is basically a <a href="http://en.wikipedia.org/wiki/Source-to-source_compiler">transpiler</a> (though they wrongly dub it a <a href="http://en.wikipedia.org/wiki/Cross_compiler">cross-compiler</a>) from C# to JavaScript. It provides a projection of the browser DOM into the C# world so that when the resulting JavaScript is executed, the normal</p>]]></description><link>https://stevelathrop.net/write-javascript-in-csharp/</link><guid isPermaLink="false">6cf6a2a3-0b5c-442d-857b-b47e79ade9fa</guid><category><![CDATA[JavaScript]]></category><category><![CDATA[C#]]></category><category><![CDATA[Visual Studio]]></category><dc:creator><![CDATA[Steve Lathrop]]></dc:creator><pubDate>Fri, 29 May 2015 16:45:00 GMT</pubDate><content:encoded><![CDATA[<p>There's an interesting little project out there in beta now called <a href="http://duoco.de/">DuoCode</a> that is basically a <a href="http://en.wikipedia.org/wiki/Source-to-source_compiler">transpiler</a> (though they wrongly dub it a <a href="http://en.wikipedia.org/wiki/Cross_compiler">cross-compiler</a>) from C# to JavaScript. It provides a projection of the browser DOM into the C# world so that when the resulting JavaScript is executed, the normal DOM interactions can occur.</p>

<p><center><a href="http://duoco.de/"><img src="https://stevelathrop.net/content/images/2015/05/duocode-logo.png" alt="duocode" title=""></a></center></p>

<p>I think that people who haven't yet learned how to write modular or object-oriented JavaScript, but who are fluent in C#, will appreciate a tool like this. But this is a commercial tool with as-yet unspecified pricing, which will certainly dampen enthusiasm.</p>

<p>I've tried other <a href="http://en.wikipedia.org/wiki/CoffeeScript">transpilers such as CoffeeScript</a>, and I can't say that I'm really a fan of them, especially when the two languages in question are very similar. I do very much like the idea of <a href="http://en.wikipedia.org/wiki/Syntactic_sugar">simpler syntax and "sugar"</a> that transpiled languages offer, but I find that there's a cognitive overhead that I can't get away from.</p>

<p>This cognitive overhead occurs because in my head I'm aware of the syntax of <em>both</em> languages and I'm often "mentally transpiling" and thinking about what the output will be when I write the to-be-transpiled code.</p>

<p>So when it comes to JavaScript and transpiling to it from CoffeeScript or C#, my take is that it is better to just learn and use the JavaScript syntax itself.</p>]]></content:encoded></item></channel></rss>