SQL Modeling at the IEEE

Thursday night I had the honor of giving my new talk “Software Modeling with ASCII, and no I’m not kidding” to the Columbus Computer Society of the IEEE.  They were very welcoming, enjoyed the talk, and had a number of comments about the technology and its use.

I start the talk with what will essentially be the first chapter of Professional Software Modeling.  We cover the problems with current modeling systems, and the timeline for modeling and object/relational mapping.

The bulk of the talk is effectively a demo of deploying a database and entity classes from a simple model, and then generating an ASP.NET MVC web site from the model.  It is similar to the hands-on lab at the PDC, with the Mini Nerd Dinner.

Right off the bat, a listener pointed out that we HAVE these tools already.  Don’t we have XML?  What’s wrong with that?  Tools like WSDL and EDMX solve these problems, and are human readable! Why do we need something else?

I agreed in principle, but I pointed out that not many people consider XML human readable any more.  The sample code was, but the 150,000-line files that end up getting used are NOT, especially when there are only about 1,500 important lines in the file.

The XML is still there, I assured everyone, and the model was still in EDMX.  M is just a way to work with the model that is a little more succinct than the XML.  Additionally, I pointed out, there are more semantic pieces to M that we hadn’t gone over.  We had done the nouns, but there are verbs too, if you get my drift.
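For a flavor of what that succinctness looks like, here is a rough sketch of a CTP-era M model in the Mini Nerd Dinner vein.  The module, type, and field names are my own invention, and the syntax is from memory of the Oslo CTPs, so treat it as illustrative rather than authoritative:

```
module MiniNerdDinner
{
    // A "noun": an entity type with fields and an identity
    type Dinner
    {
        Id : Integer32 = AutoNumber();
        Title : Text;
        EventDate : DateTime;
        Address : Text;
    } where identity Id;

    // An extent: deploying the model turns this into a table
    Dinners : Dinner*;
}
```

A couple of dozen lines like that stand in for hundreds of lines of equivalent EDMX.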

As we went over how an M model looks in the M file, and how the final database and domain classes look, someone asked the obvious question.  It’s the same question that I asked last fall.

“Great!  What do I do with an EXISTING application?”

I don’t have a good answer for that.  Would I like to be able to take an existing application’s database and look at the representative M file?  Yes.  Can I?  I am not sure I can.  That is an open question.

Honestly, I haven’t taken the time to look into the story for existing applications.  Fact is, most application development is adding features to existing applications, and I don’t know how M fits into that.  If it is going to be a good modeling tool for existing applications, there needs to be a reverse-engineering story, and I hope there is.  I mean, you can always make a model from an existing database, but I am not sure that is enough.

The last discussion we had was about potential.  Specifically, what is Microsoft doing to support the wide adoption of this product?  How about multiple data sources?  What if I need to model an identity system, where there is an Active Directory, an LDS, and a database with identity information?  Can I model that in M?

No, I answered.  Right now M is shackled to EDMX, which really only has providers for SQL Server and Oracle.  It is theoretically possible to enhance the M modeling superstructure to handle a multi-source database, but it isn’t done yet.

In general, everyone loved the talk, and I am looking forward to cleaning it up a bit and giving it a few more times.  Thanks to Jack Freund and the IEEE Computer Society for allowing me to speak!  Hope to see you all again sometime!

The Build Button

At Code and Coffee yesterday, Tim Wingfield suggested that I blog about my Build Button, so here it is.

A while back I got myself a Griffin Technologies PowerMate.  This device is designed as a multimedia controller.  Read: Volume knob.  It has six events:

  • Turn left
  • Turn right
  • Press
  • Press and hold
  • Press and turn left
  • Press and turn right

I left the Turn left and Turn right events as volume for Media Player, but I set Press to be <CTRL> + <SHIFT> + B.

That's right, build, baby.

So, when I finish a method, I can just up and smack the button, and the project compiles.  It's quite an experience.  I used to have Press and hold set to <F5>, but now I think I will have it run the unit tests, since that is how I tend to develop these days.

Anyway, it's not a cheap thrill at $45, but I still think it is worth it.

By the by, I also have an Optimus Mini Three, which I recommend for the remarkably high geek factor.


Getting started with Identity Services

I find myself needing to write a federated identity proof of concept for a client of ICC.  I got started with three downloads:

I wanted to get a good foundation, so I started with the training kit.  As an author, I heavily recommend everyone do this.  The days when you could just jump in and start hacking are long gone.  There are frameworks on top of frameworks in today’s development environments and learning the right path is paramount.

Getting started with a lab

The lab I started with was Web Sites and Identity, because it solved the particular problem that I needed solved.  Yours might be different.  The prerequisites included:

  • Microsoft® Windows® Vista SP2 (32-bit or 64-bit), Microsoft® Windows Server 2008 SP2 (32-bit or 64-bit), Microsoft® Windows Server 2008 R2, Microsoft® Windows® 7 RTM (32-bit or 64-bit)
  • Microsoft® Internet Information Services (IIS) 7.0
  • Microsoft® .NET Framework 3.5
  • Microsoft® Visual Studio 2008
  • Microsoft® SQL Express 2005 (or later)
  • Microsoft® Windows Identity Foundation Runtime
  • Microsoft® Windows Identity Foundation SDK

The basics needed to be present, but things like PowerShell permissions and IIS 7 configurations have built-in installers that ran easily from the dependency checker.



You are then asked to install snippets for code and XML.  I put them in the My Snippets folder for Visual Studio 2008.


After installing a few certificates, the labs were set up and ready to go.

Working the lab

In working with the lab, it seems that the setup scripts failed to supply the SSL binding for the Default Web Site.  I learned the fix from this ScottGu post after making this post to IIS.net.

To fix it, you just need to go into IIS 7 and follow these steps:

  1. Select the Default Web Site
  2. Click Bindings… under Edit Site on the right hand command panel
  3. Click the https binding and click the Edit… button
  4. You’ll see that the SSL certificate dropdown has No Binding Selected.  Change it to STSTestCert.
  5. Click OK and Close.

That’s all there is to it.  The site will no longer give you “Cannot connect” errors.
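If you prefer the command line, the same SSL binding can be applied from an elevated prompt.  This is my own sketch, not part of the lab; you need to substitute the actual STSTestCert thumbprint, and the appid can be any GUID:

```
:: Find the STSTestCert thumbprint in the machine certificate store
certutil -store My

:: Bind that certificate to port 443 (http.sys handles SSL for IIS 7)
netsh http add sslcert ipport=0.0.0.0:443 certhash=THUMBPRINT appid={00112233-4455-6677-8899-AABBCCDDEEFF}
```

Either way, the end result is the certificate attached to the https binding.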

Anyway, I like the lab and I like WIF.  Generally, it has the same problem as all of the W*F patterns that Microsoft provides.  It is configuration over convention, and there are SO many options that it is confusing.  WIF tries to be everything to everyone.  Finding the exact situation that suits your needs will require a little digging through the lab.

“The Application Cannot Start” in VS 2008 after install of VS 2010


I ran into a problem today where my Visual Studio 2008 install gave me the “The application cannot start” error after I had installed the RC of VS 2010.  After poking around, I found this post which helped me out a lot.

I had to copy these files from “C:\Program Files (x86)\Common Files\microsoft shared\MSEnv”:

  • dte80.olb
  • dte80a.olb
  • dte90.olb
  • dte90a.olb

and paste them into “C:\Program Files (x86)\Microsoft Visual Studio 9.0\Common7\IDE”

Then everything worked great.  No registration was needed.

Quite a week of martial arts.

I started the extended weekend timeframe with my normal martial arts club, the Columbus Ninjutsu Club.  Most regular readers probably know that I study there, and am a big fan of the art of Ninjutsu.  Thursday, we did Yoko Nage, one of my favorite throws.  It meant hitting the mat many, many times though – probably forty over the course of the class.  Then I did the first randori (full speed training) that I have done in months with friend and fellow ninja Adam-san.

I knew I would be sore the next day, so I drowned my sorrows in 800 milligrams of Advil and 32 ounces of Muscle Milk.

Friday, I was sore.

Saturday I was REALLY sore.

Sunday, I had a three-hour seminar with Don Frye.  Don and Dan-sensei are working on a movie together, Apparitions: The Darkness, and had some filming to do in Michigan.  Also, the Arnold was this weekend, so Don was here for that.  In between the two, he held a seminar for us.  Nice guy!

Don Frye seminar

Learned a lot from Mr. Frye.  He is built more like me – heavier, bigger boned – rather than the light willowy guys that mostly make up our classes.  His methods for getting people on the ground, especially, are very much along my idea of best practice.  For instance, for the two leg takedown, he comes straight in, low, still in guard, and then basically head butts you in the gut while grabbing the top of the thighs.  With his larger mass, he doesn’t have to screw around with all of the footwork of the jujitsu method.  Just knocks you down.

Anyway, that was three hours of opening up the top of my head and pouring as much in as I could.  The man knows a lot about fighting.  It wasn’t a very strenuous seminar, actually, though we all did get banged up a lot.  I have two huge bruises on my pecs from Frye demonstrating the head butt on my chest.

Strangely, Monday I didn’t feel too bad.  I guess we didn’t really DO that much, except train on a few of the techniques.  No randori, no drills, really.

Tuesday I went back to Ninjutsu, and had a much lighter class under Bryan-sensei.  Did some chokes, drills, pretty laid back.  Good thing, because I went from there to Systema with Steve-sensei, and that was an experience.

Systema is a Russian martial art based on the standup from Sambo.  It has four tenets: breathing, relaxation, movement and posture.  There are no techniques, no kata.  You just use the basic philosophy and do whatever doesn’t hurt. 

Fascinating where that takes you.  Because a lot of aikidoka study Systema, a lot of the finishing moves from Systema look like Aikido.  I have six years of Aikido, and three more of Ninjutsu (which are both Budo), so I fit right in.  Certainly will be looking more into Systema.

Columbus Architecture Group


I had a good time with the Columbus Architecture Group (ColArc) Tuesday night at the ICC conference center.  I gave the Economics of Cloud Computing talk there and it was well received.

I got some great commentary from Mark Freeman about the impact that CompuServe had on early internetworking, which is a very good point.  CompuServe was born out of TSO, with a large organization reselling unused computer time.  This is very similar to the IBM TSO concept, and what Google, Microsoft, Amazon and the other large players are doing now.

Another point was the impact of grid computing, which I need to research a little more.

One of the big impacts was security, though.  How is the cloud going to interact with HIPAA?  How do you convince a CIO?  What else has to happen to prepare your application for the insecurity of the cloud?

Location is a problem too. How about a state’s requirement to keep all data inside its borders?  There are tough questions there!

Anyway, thanks for having me folks, and I hope to see you next month.

Parameterizing web load tests

You’ve been asked to make sure that the Client Search screen will stand up under load, because it will be the most used screen in the application.  You set up a test user, and then run the Web Performance Test wizard in Visual Studio 2010 to record the test.  You make a new test project, save it in TFS, and add a new Web Performance Test.  The browser launches, and you log in as your test user.  You do a few client searches representative of the use of the system and log off.

Next, you create a Load Test.  The wizard launches, and you prescribe a 50 user test over a half hour.  You save the test, launch it and go to lunch.

When you come back, there are 23,856 errors.

What happened?  Oh, that’s right, one user can’t log into this system more than once – it was an early requirement.  Oh!  How am I going to do this, then?  Do I have to record 50 Web Performance Tests?  No.  You can parameterize the login.

Making a data source

Start with a CSV file of usernames and passwords.  You can make it in NotePad or Excel.
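A minimal sketch of such a file, with invented credentials, might look like this:

```
UserName,Password
testuser01,P@ssw0rd!
testuser02,P@ssw0rd!
testuser03,P@ssw0rd!
```

The header row becomes the column names you will bind to later.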


Next, we will need a data source that points there.  Open your webtest and click on the Add Data Source icon up in the test’s button bar.


You can select CSV file as the source of the data.


Pick the CSV file you created and you’ll see the sample data.


The new data source will show up as one of the data sources for this test.  Probably shoulda named it something better, huh?


Binding the fields

The next major step is to bind the data to the fields in question.  This is insanely easy.  Find the step where you enter the data.  In my case, it was the Login.aspx page.  Open the Form Post Parameters folder and find the parameters for the login and password.  In this app, they are pretty easy to find.


Then follow these steps to get to the next image:

  1. Open the properties panel with F4
  2. Click on the field in your application that has the user name.  Mine is txtUser_Name
  3. Click on the Value parameter in the properties panel.
  4. Click the dropdown arrow.
  5. Open the data source in the treeview.
  6. Open the testdata table (or whatever you named it)


Click on UserName and there you go.  That field is now bound to the value.
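Under the covers, the Value is just replaced with a binding expression.  Assuming the data source kept the default name DataSource1 and the table is testdata (your names may differ), the bound parameters look something like this in the webtest:

```
txtUser_Name={{DataSource1.testdata.UserName}}
txtPassword={{DataSource1.testdata.Password}}
```

Each virtual user in the load test then pulls its own row from the CSV instead of reusing the recorded credentials.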


This will work for any form

Remember, this isn’t just for login.  Actually, I am making a mashup of performance testing and training, adding the training data to the system using the performance test for the New Item pages.  Load testing isn’t just for performance anymore!

Family Game Night should be back

I just spent the evening playing board games with my four year old son.  For a lot of people this would be an exercise in boredom, but it shouldn't be.  Teaching games is something that is very similar to teaching the kind of thinking that makes software design work.  It’s important, logical thinking.

Board games with young children don’t have to be limited to Chutes and Ladders and Candyland – random games with zero strategy.  Kids need to LEARN strategy.  The only way they will learn is to be led, hand in hand, through the process of making game decisions.  For instance, tonight Adam and I played Living Labyrinth.  He can’t quite read the cards, and he has a hard time making decisions about how to use the cards.  But how else will he learn?


We played open hand, and I walked him through every move.  I reminded him to play his card first then move, and point blank told him what moves to make and why.  It wasn’t competitive, but it was a blast, and Adam learned a ton.  I’m betting that next time we play he’ll remember the cards and be able to make some decisions about his card use.

After that, we played a much less sophisticated game, Guess Who? This game is a deduction game similar to the old logic puzzles with the grid that we all did in the puzzle magazines.  The kicker here – Adam beat me five out of five games.  I can’t explain it, unless it is just that he is a good guesser.  We play fair and square, no help, no hints, and he has to sound out the name of the mystery person for his final guess.  Beat my pants off.

Next time I am introducing him to Kids of Catan.



This remarkable game will not only be a great rule learning adventure, but the pieces are cool and we can make up our own games – another important skill.

Plus, I can have him play against Jeff Blankenburg next year at CodeMash.

Speaking schedule looking good this year


I have been shooting for one talk a month, and things are looking great for that goal these days.

  • In January, I gave my Economics of Cloud Computing talk at CodeMash.
  • This month I gave Deviant Ollam’s Lockpicking talk,
  • In March I’ll be giving the Economics of Cloud Computing talk to ColArc,
  • In April I’ll be giving my new Software Modeling with ASCII talk to the IEEE and ACM.
  • I have a talk lined up for May, I think.
  • I have talks submitted for TechEd, DevLink and CodeStock so that should give me lots to do for the summer if anyone accepts me.

Then I just have the fall to worry about.  Pretty good year so far!

Modeling and Design

Being in the design phase of a large-scale project for the State of Ohio, I have been giving a lot of thought to the problems of Modeling and Design.

You see, I am in an interesting position on most projects – I am an actual Solutions Architect.  In today’s IT environment the term ‘architect’ has grown to mean “good coder with big mouth” and I don’t really fit that mold.  I am not the best coder in the room most of the time – although the size of my mouth could be debated.

What I excel at is the designing of solutions, which leads me to use the term ‘software designer’ to describe myself.  This isn’t terribly good either, as it leads people to believe that I produce pretty graphic design and UX, which isn’t my specialty either.

What I am, in truth, is a Software Modeler.  I am the person that figures out what the domain model looks like based on the requirements of the system and the realities of the project.  The one who stands at the end of the room and scribbles boxes on the whiteboard?  Yeah, that’s me.

Thinking about this has really helped me to hone down the definition of SQL Modeling in my head.  People ask me “Why is Oslo hooked to SQL Server?  Is it just another data language?  Don’t we have enough of these?  I thought M was a language not a database thingie!  Oh woe is me!”  Well, the key is in the difference between design and modeling, and how tightly coupled the model and the database really are.

Software modeling begins and ends with data.  The user interface is, I hate to point out, now the purview of the business analyst.  When I get a functional specification for a project, it has the UI in it right there.  I don’t have to ‘design’ that any more.  All I have to figure out is how to do it.  This effort is modeling, and it is about data.

Today I needed to estimate an inventory system with a few quirks.  Without thinking, I got out my notepad, sketched an ER diagram, and worked from that.  How many POCOs?  How many adapters?  How big is the entity model?  Yes, eventually I had to estimate how long it would take me to build the screens, but the bulk of the work was in the domain model, where the work is done, as it should be.

Livescribe Notebook Page 48

Imagine if I could instead describe my model in text, without all of the brittleness that the ER diagram entails, and have that text description emit both the database and the POCOs for me.  And then, if something changes, I could change my scratch model and the rest of the code would change with it.  What if it didn’t break my changes or additions?  What if it didn’t create scores of stored procedures that are impossible to manage?

Wouldn’t that be nice.

This, though, is the goal of SQL Modeling.  It is clear as day exactly where it fits in the product lifecycle – right there overtop of my fancy notebook scrawls.  Those boxes and chicken feet have a place somewhere, but it’s time for a modeling solution that actually is a solution.

Bill Sempf

Husband. Father. Pentester. Secure software composer. Brewer. Lockpicker. Ninja. Insurrectionist. Lumberjack. All words that have been used to describe me recently. I help people write more secure software.


