Monday, October 17, 2011

Let's use Event Sourcing on the Cloud, another Amazon project with Whale in the name

So, I'm playing with Event Sourcing, and I cooked up my own implementation, mostly because I want to be able to explain it well to those who ask.

Let's explain in code with a really simple domain that everyone will understand -- Bank Accounts, you better have one!

How does one get a Bank Account? They open it, right? So I made this event.
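Something along these lines (a sketch - the property names here are illustrative, not necessarily the originals):

```csharp
using System;

// A sketch of the event -- property names are illustrative.
public class AccountOpened
{
    public Guid AccountId { get; set; }
    public decimal InitialBalance { get; set; }
    public DateTime OpenedOn { get; set; }
}
```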

Now let's make a quick POCO domain object for an account. The ctor will take this event and do some cool stuff with state. Looks like this.
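Roughly like this (again a sketch, not the verbatim sample - the AccountOpened event is repeated so the sketch stands alone):

```csharp
using System;
using System.Collections.Generic;

// Sketch: the ctor just routes through Apply, so the exact same mutation code
// runs when the repository replays history later.
public class AccountOpened
{
    public Guid AccountId { get; set; }
    public decimal InitialBalance { get; set; }
    public DateTime OpenedOn { get; set; }
}

public class Account
{
    public Guid Id { get; private set; }
    public decimal Balance { get; private set; }
    public List<object> UncommittedEvents { get; private set; }

    protected Account()                      // used when replaying from a repository
    {
        UncommittedEvents = new List<object>();
    }

    public Account(AccountOpened @event) : this()
    {
        Apply(@event, false);
    }

    public void Apply(AccountOpened @event, bool isReplaying)
    {
        Id = @event.AccountId;
        Balance = @event.InitialBalance;
        if (!isReplaying) UncommittedEvents.Add(@event);
    }
}
```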

Two interesting things to note here. First, I'm mutating my state in the Apply(AccountOpened) method, not just in the ctor. That's because the same method will be called later, when we get the Account back from a repository. Yeah, it's a bit different, but honestly, the benefits WAY outweigh this single "drawback" to me. What benefits, you ask? Glad you did... We'll get there in a minute.
The second difference is that UncommittedEvents object... We'll also get to that in a minute. For now, let's write a quick integration test and watch WhaleEs in action. First, we bootstrap WhaleEs, like so.
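Something like this - the fluent method names here are a guess at the WhaleEs config API, and the credentials file layout is illustrative:

```csharp
// A sketch only -- configuration method names and file layout are illustrative.
var awsInfo = System.IO.File.ReadAllLines(@"c:\mysecrets\aws.txt");

var repository = WhaleEsConfiguration
    .New()
    .AccessKey(awsInfo[0])
    .SecretKey(awsInfo[1])
    .Bucket("whale-es-events")
    .BuildRepository();
```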

I'm just parsing a text file that has my Amazon info in it, then calling WhaleEs' fluent config API so that I can get an instance of Repository.
Next I'll create an account and save it.
Like so.
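Roughly this (a sketch; the repository comes from the bootstrap step):

```csharp
// Sketch: open an account with $100 and save it.
var account = new Account(new AccountOpened
{
    AccountId = Guid.NewGuid(),
    InitialBalance = 100m,
    OpenedOn = DateTime.Now
});
repository.Save(account);
```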

Kinda boring, huh? Let's go look at the guts of Repository, 'cause I know that's what you wanna see.

Just a little reflection magic here. I really don't wanna make people reference my library and ruin the POCO-ness of their AR objects, so I just let them configure the name of the property that holds the list of uncommitted events. That's pretty much the convention everyone is using for ES, so I'm sure we'll be ok with that. Anyway, I'm calling that getter, then persisting the events it returns (using the object I blogged about last).
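A sketch of what Save boils down to - the pluggable persistence delegate and configurable property name are illustrative, not the exact WhaleEs code:

```csharp
using System;
using System.Collections;

// Sketch of Save: the AR stays a plain POCO because we read its uncommitted
// events via reflection, using a configurable property name.
public class Repository
{
    private readonly string _uncommittedEventsProperty;
    private readonly Action<object, object> _persistEvent; // (aggregate, event)

    public Repository(Action<object, object> persistEvent,
                      string uncommittedEventsProperty = "UncommittedEvents")
    {
        _persistEvent = persistEvent;
        _uncommittedEventsProperty = uncommittedEventsProperty;
    }

    public void Save(object aggregate)
    {
        // Call the configured getter via reflection...
        var property = aggregate.GetType().GetProperty(_uncommittedEventsProperty);
        var events = (IEnumerable)property.GetValue(aggregate, null);

        // ...then persist each event it returns.
        foreach (var @event in events)
            _persistEvent(aggregate, @event);
    }
}
```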
The Get method uses similar logic. It simply pulls the event stream for the AR with that Id, then uses reflection to call Apply(event, true) for every event, rebuilding state. The extra true parameter is isReplaying; that way the AR won't add the event to its UncommittedEvents property. Here it is.
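A sketch of the idea (the event-stream loader is stubbed out as a delegate here, and the AR is created through a non-public ctor - illustrative, not the exact WhaleEs code):

```csharp
using System;
using System.Collections.Generic;

// Sketch of Get: replay every event through Apply(event, true) via reflection,
// so nothing lands in UncommittedEvents while state is rebuilt.
public class ReplayingRepository
{
    private readonly Func<Guid, IEnumerable<object>> _loadEventStream;

    public ReplayingRepository(Func<Guid, IEnumerable<object>> loadEventStream)
    {
        _loadEventStream = loadEventStream;
    }

    public T Get<T>(Guid id)
    {
        // true = allow a non-public ctor, so the AR's public API stays clean.
        var aggregate = (T)Activator.CreateInstance(typeof(T), true);

        foreach (var @event in _loadEventStream(id))
        {
            var apply = typeof(T).GetMethod("Apply", new[] { @event.GetType(), typeof(bool) });
            apply.Invoke(aggregate, new[] { @event, true });
        }
        return aggregate;
    }
}
```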

Not too horribly complicated, actually. Let's see if it works.
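The test is basically this (sketch):

```csharp
// Sketch of the round-trip: save, pull it back out, print the balance.
var id = Guid.NewGuid();
repository.Save(new Account(new AccountOpened { AccountId = id, InitialBalance = 100m, OpenedOn = DateTime.Now }));

var loaded = repository.Get<Account>(id);
Console.WriteLine(loaded.Balance); // 100
```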

Sweet! The console outputs 100, just like I opened the account with. Let's play with this a little more. Make a couple deposits and withdrawals.
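The new events and Apply methods follow the same pattern (a sketch; names illustrative, and these members go on the Account class):

```csharp
// Sketch: two more events, plus the members that go on Account. Same pattern
// as AccountOpened -- mutate in Apply, and only track the event as
// uncommitted when we're not replaying.
public class DepositMade
{
    public decimal Amount { get; set; }
    public DateTime MadeOn { get; set; }
}

public class WithdrawalMade
{
    public decimal Amount { get; set; }
    public DateTime MadeOn { get; set; }
}

// On Account:
public void Deposit(decimal amount)  { Apply(new DepositMade { Amount = amount, MadeOn = DateTime.Now }, false); }
public void Withdraw(decimal amount) { Apply(new WithdrawalMade { Amount = amount, MadeOn = DateTime.Now }, false); }

public void Apply(DepositMade @event, bool isReplaying)
{
    Balance += @event.Amount;
    if (!isReplaying) UncommittedEvents.Add(@event);
}

public void Apply(WithdrawalMade @event, bool isReplaying)
{
    Balance -= @event.Amount;
    if (!isReplaying) UncommittedEvents.Add(@event);
}
```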

This works great, the test passes, and that proves we're actually rebuilding state. "So what, Elliott! I could do that by just saving state, you twit!" you say.
Yes, 'tis true, my dear friend, you certainly could. However, requirements change. Let's say they did, and now we want a list of all activity that ever occurred on the account. In good 'ole state land, we'd just cross our fingers that when we originally designed the system we had the good sense to actually save Deposit and Withdraw objects to some persistence. Maybe we would have, but maybe we wouldn't. The point is, working this way, even though we didn't plan for it, it's very easy to add an "ActivityList" property to our Account object. Just modify the Apply methods for the appropriate events... Like so.
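Something like this (a sketch; it assumes Account now has a `public List<string> ActivityList { get; private set; }` initialized in the ctor):

```csharp
// Sketch: the same Apply methods, now also recording a human-readable entry.
public void Apply(AccountOpened @event, bool isReplaying)
{
    Id = @event.AccountId;
    Balance = @event.InitialBalance;
    ActivityList.Add(string.Format("Account Opened On {0} with ${1}", @event.OpenedOn, @event.InitialBalance));
    if (!isReplaying) UncommittedEvents.Add(@event);
}

public void Apply(DepositMade @event, bool isReplaying)
{
    Balance += @event.Amount;
    ActivityList.Add(string.Format("Deposit made on {0} ${1}", @event.MadeOn, @event.Amount));
    if (!isReplaying) UncommittedEvents.Add(@event);
}

public void Apply(WithdrawalMade @event, bool isReplaying)
{
    Balance -= @event.Amount;
    ActivityList.Add(string.Format("Withdraw on {0} ${1}", @event.MadeOn, @event.Amount));
    if (!isReplaying) UncommittedEvents.Add(@event);
}
```

Because the full history is in the event stream, the activity list gets rebuilt for events that happened before the feature even existed.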

Add a quick test like this.
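Roughly this (sketch):

```csharp
// Sketch: rebuild the account from its event stream and dump the activity.
var loaded = repository.Get<Account>(accountId);
foreach (var entry in loaded.ActivityList)
    Console.WriteLine(entry);
```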

And we see this....

Account Opened On 10/5/2011 2:35:43 PM with $1000
Deposit made on 10/5/2011 2:35:43 PM $45
Withdraw on 10/5/2011 2:35:43 PM $17

Anyway, I know there's a ton of writing on this stuff, but I hope this helps someone somewhere.
Feel free to yank the code down from github at

Have fun!

Monday, October 10, 2011

Quick and Dirty Event Source using Amazon S3 and JSON

So yeah, I'm doing it too: writing an event store. Not really because I think Jonathan Oliver's isn't good, just because I wanna wrap my head around it. Lemme break Event Sourcing down into as small a nutshell as possible.

Event Sourcing is the storing of a "stream" of events that represent the history of all that has happened in a system, and using that to create state instead of simply storing the state.

If that doesn't make sense to you, well, that's not the purpose of this blog. Go google it a bit, read some more, then come back. This particular post is about the quick and dirty ES implementation I'm writing. If you're too lazy for google, here's a few great links.

Here's the idea. Events are raised by Aggregate Roots, and I'll create one stream for each AR. I'll serialize the stream of events (for now using JSON) to an Amazon S3 bucket with a key of the AR's type name. Seems pretty simple to me... Let's give it a whirl.

First, I'll create some test dummies for my AR and an Event. Like so
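Something like this (a sketch - TestEvent.What is the property used later in the post; the rest of the shape is illustrative):

```csharp
using System;

// Sketch of the test dummies.
public class TestAggregateRoot
{
    public Guid Id { get; set; }
}

public class TestEvent
{
    public string What { get; set; }
}
```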

I think I'll make an EventSource<T> object, where T is the AR type... Here's a quick integration test - starting with actual integration because, well, because I don't wanna get bit by Amazon quirks that I'd miss in a unit test.

Not really a test, but I'll just look in S3 for the file for now... Like I said, this is quick and dirty! Let's go make this happen.  So first, we're gunna need a way to tell the serializer what types the actual events are, so we can't just dump them all in a file and persist it. I created a quick little EventEnvelope class that wraps that. Like so.
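A sketch of it:

```csharp
// Sketch of the envelope: the CLR type name rides along with the serialized
// event so we know what to deserialize each one back into.
public class EventEnvelope
{
    public string EventType { get; set; }       // e.g. the assembly-qualified type name
    public string SerializedEvent { get; set; } // the event's own JSON
}
```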
We'll just serialize a list of these to S3. Ok, let's do this.
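The save path boils down to something like this - I'm showing the current AWS SDK for .NET shapes (PutObjectRequest, etc.), so treat the details as a sketch rather than the original code:

```csharp
using System;
using System.Collections.Generic;
using Amazon.S3;
using Amazon.S3.Model;

// Sketch of EventSource<T>'s save path: one file per AR instance, keyed by
// type name + Id. The serializer delegate stands in for the JSON code.
public class EventSource<T>
{
    private readonly IAmazonS3 _s3;
    private readonly string _bucket;
    private readonly Func<object, string> _serialize;

    public EventSource(IAmazonS3 s3, string bucket, Func<object, string> serialize)
    {
        _s3 = s3;
        _bucket = bucket;
        _serialize = serialize;
    }

    public void SaveEvents(Guid id, List<EventEnvelope> envelopes)
    {
        _s3.PutObjectAsync(new PutObjectRequest
        {
            BucketName = _bucket,
            Key = typeof(T).Name + "/" + id,
            ContentBody = _serialize(envelopes)   // the JSON list of envelopes
        }).Wait();
    }
}
```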
Looks pretty simple... Run the test. No exceptions and yep, I see a file there.

The contents look like so...
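Something like this (illustrative, not the actual file):

```
[
  {
    "EventType": "Tests.TestEvent, Tests",
    "SerializedEvent": "{\"What\":\"Blah\"}"
  }
]
```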

Cool. Looks ok so far. Now let's make that GetEventStream method work. I'll add a test that just writes out the TestEvent.What value I put up there.

Pretty simple... Now let's make it pass.
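A sketch of the two members involved, inside EventSource<T> - SDK shapes are the current ones, and the delegate fields are illustrative stand-ins for the JSON code:

```csharp
// Sketch: ExistingEvents yanks the envelope file from S3 for that Id, and
// GetEventStream turns each envelope back into its real event type.
public IEnumerable<object> GetEventStream(Guid id)
{
    // _deserialize is whatever Func<string, Type, object> JSON code you wired in.
    return ExistingEvents(id)
        .Select(e => _deserialize(e.SerializedEvent, Type.GetType(e.EventType)));
}

private List<EventEnvelope> ExistingEvents(Guid id)
{
    var response = _s3.GetObjectAsync(_bucket, typeof(T).Name + "/" + id).Result;
    using (var reader = new System.IO.StreamReader(response.ResponseStream))
        return _readEnvelopes(reader.ReadToEnd()); // Func<string, List<EventEnvelope>> for the JSON list
}
```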

Pretty simple, huh? We're just calling that ExistingEvents method, which yanks the file from S3 for that Id.
Run the test, and yep, I see "Blah" in my console.

Ok, so I'm not sure if this code will actually get used, because Jonathan Oliver did some really great ES stuff and he's working on S3/SimpleDB implementations now. However, I really wanted to create a quick implementation for some POC stuff, and I was surprised at the simplicity of it. Next up, let's see if I can write something to get some AR state working off a stream.

Monday, October 3, 2011

Givin' Amazon cloud services some luv... Why you should consider using Amazon for SOA needs

So, the Los Techies Open Spaces event was amazing. I have never been in an environment where I was surrounded by so many people that just "got it". It was just awesome.

There was a recurring theme (besides FUBU and javascript) that I found myself gravitating to: EventSourcing and messaging patterns (surprise, huh?). The last session was on messaging systems, and I was surprised to be kinda the lone voice in the room mentioning Amazon services. I figured I'd make a quick blog post where I show the actual pub/sub guts of WhaleBus, and how simple it is to get messaging working on the Amazon cloud.

First of all, go sign up for AWS, it's free to sign up, and the free tier allows 100K SNS and SQS requests and up to 1K email notifications.

Done? Ok cool, let's get to coding.

Let's publish a message to the cloud real quick like.  Spin yourself up a project and get AWSSDK.dll (it's on nuget- just search for Amazon, it's the first result).

So, first, let's create a "Topic" to publish our messages to. Amazon gives us a web interface to do this, but who wants that? Let's do it programmatically...
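Something like this - note I'm showing the current AWS SDK for .NET shape, so treat it as a sketch rather than the exact 2011 code; the topic name is illustrative:

```csharp
using System;
using Amazon.SimpleNotificationService;
using Amazon.SimpleNotificationService.Model;

class CreateTopicExample
{
    static void Main()
    {
        // The client picks up credentials from your AWS config/environment.
        var sns = new AmazonSimpleNotificationServiceClient();

        // CreateTopic is idempotent per name; it returns the topic ARN either way.
        var response = sns.CreateTopicAsync(new CreateTopicRequest { Name = "ElliottHasAnAwesomeBlog" }).Result;
        Console.WriteLine(response.TopicArn);
    }
}
```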

Simple enough, I see the topic arn on my console when I run it. And I can see the new topic in my aws console.
Next up, let's publish a message to that topic. First, let's just set up an email endpoint through the aws console (we could do it programmatically just as well, but I wanna publish real quick like).

Create an Email Endpoint UI in AWS console

Now, amazon sends me an email to the address I specified and makes me opt in (otherwise this would be a really cool way to piss people off, huh?). I follow the link in the email. Subscriber is done.

Simple enough, right?
Now, when I publish a message to this topic, I SHOULD get an email with the message JSON serialized. Let's try it out.
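A sketch (current SDK shape; the ARN is obviously fake):

```csharp
using Amazon.SimpleNotificationService;
using Amazon.SimpleNotificationService.Model;

// Sketch: publish a JSON-serialized message to the topic.
var sns = new AmazonSimpleNotificationServiceClient();
sns.PublishAsync(new PublishRequest
{
    TopicArn = "arn:aws:sns:us-east-1:123456789012:ElliottHasAnAwesomeBlog", // from CreateTopic
    Subject = "Elliott has an awesome blog",
    Message = "{\"Greeting\":\"Hello from the cloud\"}"
}).Wait();
```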

Run the test, and check my email, and booya!

So, there we go: I just created a topic, then published a message to that topic. However, I'm having a hard time seeing how useful getting emails of published messages is. I mean, I could just send myself an email with a subject of "Elliott has an awesome blog" and a body of "Hello from the cloud" and get the same result, right?

So now let's go set up a subscriber that has a little more value. SNS supports HTTP posts, but let's not do that; let's use an SQS queue. Yeah, we can create one through their UI, but let's do it with code.
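A sketch of the whole dance (current SDK shapes; queue name, ARNs, and the policy wiring are illustrative):

```csharp
using System.Collections.Generic;
using Amazon.SimpleNotificationService;
using Amazon.SQS;

// Sketch: create the queue, grant SNS permission to deliver to it, subscribe it.
var sqs = new AmazonSQSClient();
var sns = new AmazonSimpleNotificationServiceClient();

var queueUrl = sqs.CreateQueueAsync("blog-demo-queue").Result.QueueUrl;
var queueArn = sqs.GetQueueAttributesAsync(queueUrl, new List<string> { "QueueArn" })
                  .Result.QueueARN;

var topicArn = "arn:aws:sns:us-east-1:123456789012:ElliottHasAnAwesomeBlog";

// The security part: a queue policy allowing the SNS topic to send messages.
var policy = @"{
  ""Version"": ""2012-10-17"",
  ""Statement"": [{
    ""Effect"": ""Allow"",
    ""Principal"": { ""Service"": ""sns.amazonaws.com"" },
    ""Action"": ""sqs:SendMessage"",
    ""Resource"": """ + queueArn + @""",
    ""Condition"": { ""ArnEquals"": { ""aws:SourceArn"": """ + topicArn + @""" } }
  }]
}";
sqs.SetQueueAttributesAsync(queueUrl, new Dictionary<string, string> { { "Policy", policy } }).Wait();

// Finally, subscribe the queue to the topic.
sns.SubscribeAsync(topicArn, "sqs", queueArn).Wait();
```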
Ok, ok, lots of code there, but it's almost all security stuff that you'll only do once per SQS queue.
The bulk of it is simply setting the security policy to allow SNS to publish messages to the newly created SQS queue.

Now, let's publish that message again and write a quick little test that'll pull messages from the SQS queue. So, I run the publish_a_message test, and I see the email arrive. So I know the message made it to SNS; let's write the code to pull from the SQS queue.
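A sketch (current SDK shape; the queue URL is fake):

```csharp
using System;
using Amazon.SQS;
using Amazon.SQS.Model;

// Sketch: poll the queue for ~5 seconds and dump anything we receive.
var sqs = new AmazonSQSClient();
var queueUrl = "https://sqs.us-east-1.amazonaws.com/123456789012/blog-demo-queue";

var stopAt = DateTime.UtcNow.AddSeconds(5);
while (DateTime.UtcNow < stopAt)
{
    var response = sqs.ReceiveMessageAsync(new ReceiveMessageRequest { QueueUrl = queueUrl }).Result;
    foreach (var message in response.Messages)
    {
        Console.WriteLine(message.Body); // SNS wraps our payload in its own JSON envelope
        sqs.DeleteMessageAsync(queueUrl, message.ReceiptHandle).Wait();
    }
}
```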

All this code is doing is looping for 5 seconds and calling ReceiveMessage, then writing the contents of each message to the console. Here's what I see.
So, yeah, it works, and it's simple. So I like it. Whatta ya think?

Thursday, September 29, 2011

Looking for good software engineers in Austin, Tx

 Ok, so they put a link to my blog on the corporate site, and while I've been blowing up twitter about the fact that we're hiring, I figured I'd make one little blog post about the fact that we're hiring, what we're looking for and how AWESOME this job is. Email me at if you're interested, or hit

Why is WhaleShark awesome? 

Yeah, we hire interns too!

The deal industry is going nuts. It ain't no secret that the economy has seen better days, and consumers are trying to figure out how to buy more stuff with less money. So, yeah, we're killing it economically. How do I know? Well, every Tuesday we have an all-hands meeting, and they tell us actual numbers. It's really cool to work for a company where people start wondering what's wrong when we only exceed revenue forecast by 4 or 5%.

Our kitchen
The technology stack is pretty neat here. By "pretty neat", I mean AWESOME! We use Voldemort for persistence. It's a distributed key/value store that's pretty nifty. We serialize everything using Google Protocol Buffers (the value part of key/value). All queries (since you can't query a key/value store easily) go through Solr. We also use the heck outta Amazon services. We use Amazon S3 to store all sorts of data, and SNS and SQS for pub/sub (see my WhaleBus post). We have massive amounts of data that we use Hadoop/Pig to process to get lots of cool analytics stuff. We've got sites in classic ASP, .Net 1.1, MVC, wordpress, LAMP - pretty much everything you can think of, and we make it all work together. In short, it's pretty freekin' cool. What's even cooler is we'll keep getting more sites; the executive team is all about acquisition, so engineering has to stay agile. That's a good thing to me.

Yeah, that's Dublin baby!
Lunch is catered (at least) every Tuesday. The fridge is always stocked with frozen lunches, Blue Bell ice cream, bagels and waffles (for us morning guys), and lots of other goodies. And YES, to the right, that is a picture of Dublin Dr. Pepper on tap. The office is AMAZING. Glass conference rooms, desks that promote pair programming and impromptu conversations, but still large enough that you can take a personal call and not let everyone know that your kid just got sent home for flicking a booger on the teacher. There are small couches all over the place so you can sit and talk about how awesome CQRS is, or why the Cowboys aren't gunna suck this year, without the need to reserve some conference room. Oh yeah, there's usually beer in the fridge on Friday, and no one gives you crap if you drink it.

My Desk (and Pratik being awesome)
The culture at WhaleShark is another thing worth mentioning. I've been around the block a couple times. Too many times I've seen a great company lose good engineers because of some silly political issues between product, engineering, and executive staff. Cotter (our CEO) is amazing about NOT allowing that kinda thing; we have an open culture like I've never seen. Most (I think all) of our full time employees have options, so we've all got a vested interest in the success, and we all work together at it. We work hard, but have fun too. I'm the resident nerf dart expert, and as the Team Lead for the Internal Systems team, it's not rare for Angie (one of the operations managers) to come shoot me with her nerf gun because one of the feeds imported a coupon wrong. We have a LOT of nerf gun wars, actually.

Board Room
Conference room (and Jamie)
Elevator Lobby

 So what types of Engineers are we looking for?
  • Passionate about their craft - someone who codes because they love it, not because it's their career.
  • Is willing to step out of the comfort zone, and does. Someone who learns things because they want to.
  • Someone who understands the importance of testing (I'm a TDD guy, and that's what I want, but it's not a must)
  • Someone strong in OOP and Design (I'm a big DDD guy)
  • Someone who is interested or has experience in distributed systems (CQRS is a big plus in my book)
  • An awesome engineer - I WILL go through coding exercises with you, and if you think giant case statements are ok, then, move on :)
  • A github account with some examples of stuff you've done is a big plus.
Ok... sorry for the spam! Back to coding for me! 

Monday, September 26, 2011

WhaleBus Publishing - An example of Amazon SNS and Amazon SQS working together

So, let's talk a little bit about the guts of WhaleBus...

The problem I'm solving at WhaleShark is communicating things across different applications - when something happens on one site, we may want to create some content on another. This fits naturally with Amazon Simple Notification Services.

Amazon SNS is simply a push mechanism. You have "Topics" that endpoints subscribe to. The endpoints can be of various types (HTTP post, HTTPS post, email with a JSON payload, and Amazon Simple Queue Service). We'll be creating a topic per type of event.

To publish an event, consumers just call EventPublisher.Publish<T>(T @event). The event publisher will create an SNS topic if one doesn't exist, serialize the event, then publish it to Amazon. It's actually pretty simple.

I really like this approach, because at WhaleShark we have quite a few non-.Net websites. We can still push up events pretty much any way we want (they're just protocol buffer serialized messages) to the proper SNS topic, and all subscribers will be notified - regardless of platform.

Back to .Net though... 

For subscription, we use WhaleBusHost. It's simply an executable that pulls from SQS queues. All instances of WhaleBus require a config property to be set for ApplicationName. This application name is used to create an SQS queue, and all instances of WhaleBus with the same ApplicationName will share that queue. This is pretty cool, because you can easily throw more instances at it for scalability.

After it creates the SQS queue, it scans the application bin directory for all classes that implement IDomainEventHandler<>. It then pulls out the type for each handled event and subscribes the SQS queue to that event's SNS topic.
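That scan is plain reflection. A sketch (the interface names match the post; the scanner itself and the demo handler are illustrative):

```csharp
using System;
using System.Linq;
using System.Reflection;

public interface IDomainEvent {}
public interface IDomainEventHandler<in TDomainEvent> where TDomainEvent : IDomainEvent
{
    void Handle(TDomainEvent @event);
}

// A demo event + handler so the scan has something to find.
public class SomethingHappened : IDomainEvent {}
public class SomethingHappenedHandler : IDomainEventHandler<SomethingHappened>
{
    public void Handle(SomethingHappened @event) {}
}

public static class HandlerScanner
{
    // Find every IDomainEventHandler<TEvent> implementation in an assembly
    // and pull out the event types -- that's what drives the subscriptions.
    public static Type[] FindHandledEventTypes(Assembly assembly)
    {
        return assembly.GetTypes()
            .SelectMany(t => t.GetInterfaces())
            .Where(i => i.IsGenericType &&
                        i.GetGenericTypeDefinition() == typeof(IDomainEventHandler<>))
            .Select(i => i.GetGenericArguments()[0])
            .Distinct()
            .ToArray();
    }
}
```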

Let's make a pretty picture...

The green clouds are SNS topics, Orange ones are SQS queues for specific applications, the blue boxes are instances of WhaleBusHost.exe pulling from those queues.

Note that those green clouds can also send messages via HTTP posts; I just didn't show that in this diagram. I really think this is an awesome approach to a very real problem. At WhaleShark we want to be able to quickly acquire new websites, and allow our current operations staff to use the tools they're currently using to moderate content and activity on the new sites. Instead of ripping the guts out of each of the new sites, we can simply write publishing and subscribing code for the events our administration website cares about. The subscription code for those events will already be written. Picture anyone?

Let's say WhaleShark buys the (I think) fictional website Let's say it's a php site with a mysql database. Well, when one of our awesome operations folks or one of our super secret automated tools finds a killer deal on bacon....

We'd definitely want that content to show up on our other properties, right?
Given a system kinda like this.

We've got a few options on how to make that happen.

  1. Write some quick mapping code that maps from the DealFound event to whatever sharebacon.php expects as a post, and add that as a new HTTP endpoint for the DealFound SNS topic, like so.

  2. Write an implementation of IDomainEventHandler that inserts directly into BaconCoupons.db.

I don't think there is a right or wrong approach here. It totally depends on the logic that exists in sharebaconcoupon.php, and on the team structure (did we get a bunch of new php superheroes when we acquired the site?). I'm guessing we'd be more inclined to number 2, since any business logic that exists (checking for duplicates, etc.) would be in .Net code and could use libraries we've already written, but that doesn't mean we can't (or won't) use the first approach.

Ok, I've gotta quit writing blogs and start writing code. Catch everyone later....

Tuesday, September 20, 2011

Using Messaging in Real World, Legacy Apps (and intro to WhaleBus) - Amazon SNS and Amazon SQS

So, if you wanna pull down the code before you read my blog post... It's here...

Also, if you're familiar with NServiceBus and this code looks something like it, that's not an accident. I think Udi Dahan made an awesome product, and I've used it with great success on a few projects. I've just got a few different problems at the new gig, and figured I'd give my own little service bus a shot. To be fair, it's pretty much the Ellemy.CQRS stuff I had been blogging about before moving here, but yeah, I spent the last week making it work a lot better to solve some problems we have here.

We own coupon websites - a lot of them. The business plan is pretty much, buy them up, make them a little better (hopefully), throw a little better marketing at them, and make LOTS OF money. The state of our current systems looks something like this.

Note that those first few sites, along with Howzat, were all greenfield applications. They're things our internal staff wrote from the ground up, in a traditional N-Tier approach, and they all share a Voldemort document store that we've named "Madre".

We clearly have a lot of work to do. It's not cost effective to acquire a property and then train ops employees on the admin features of each website, but that's how things are being handled now. We've gotta change that.

Certainly one approach is to migrate all properties to "Madre". It would "kinda" work. However, to me, that's a LOT of work. At the end of the day, ops needs to be able to use one application to update ANY website that we own. That's it. There's no requirement that all apps share a single database, just that the apps can be updated from a single application and that our execs can see the money we're making.

Well…. That sounds like a great candidate for messaging huh?

Think about it. The business logic for most (if not all) use cases already exists on every website we'd ever acquire, right? I mean, when someone shares a coupon on some php site with a mysql database, there's already code on that site that updates the database... Otherwise, well, we probably wouldn't be buying it.

We already use Amazon Cloud services for lots of stuff… Simple Notification Service is PERFECT for this. It supports a few different types of endpoints, including HTTPS post. So, our admin interface pushes some message (a domain event) out to SNS, and we configure SNS to have an HTTPS endpoint to some page on our site that takes the post. Here's a pretty picture I just drew using google documents.

This way, the only thing new we need to write on RetailMeNot is some HTTP endpoint that translates the CouponSharedOnSite message to call whatever code ALREADY EXISTS on RetailMeNot. Pretty simple, and makes the acquisition of new properties MUCH much easier (something that makes our CEO happy - and having a happy CEO is a good thing - trust me).

Ok, so, that's awesome and all, but we've got a finance department, they like to say things like "Hey, we're 8% up in revenue over projection" and they wanna know if they're actually telling the truth. The picture above is awesome to get customers on RetailMeNot to see some coupon that one of our ops person created on the Howzat application, but what about when finance wants to know that someone actually clicked on one of our coupons?

Well, we're working on a really nice database that, for the purposes of this blog, we're gonna say is totally complete. All we need to do is update that database when someone clicks on a link that goes to a vendor. Note that we want this to happen for ALL of our properties, and also note that our CEO loves buying new websites. Also note that this is how pretty much EVERY coupon site makes revenue, so they all record these outclicks in some way. All we need to do is add code to whatever already records the outclick on each property so it also sends a message out to SNS, and then write an app that subscribes to those messages.
I'll draw you another picture, cause I think google drawings are cool.

Ok, cool pictures and everything right? But no one comes to my blog for pictures… Let's talk code.


For now, I'm gonna just go over how to use my new little pet project, we'll dive deep into the code in some future blog.
First of all, all the messages we'll be sending are DomainEvents, not commands. This is because command handling is (for the most part) application specific. If you don't understand this, well, read that link for DomainEvents. In a nutshell, DomainEvents communicate something that has ALREADY HAPPENED that other systems may care about, like "Customer Clicked On A Deal" or "Some deal was just expired".

When an event happens, applications want to do things based on that event. Let's express that in code.

namespace WhaleBus
{
    public interface IDomainEvent {}

    public interface IDomainEventHandler<in TDomainEvent> where TDomainEvent : IDomainEvent
    {
        void Handle(TDomainEvent @event);
    }
}
So, if we wanted to add a row to some db table when a customer clicked on a deal, we'd simply write the code in a class implementing IDomainEventHandler<CustomerClickedOnDeal>. I'll show some of this in a future blog, but for now I'm gonna assume you can figure that out.

I had two real use cases in this blog: one where we were publishing an event, and one where we subscribe to it.

Here's the code you'd use to publish events….

var publisher = EventPublisherFactory.CreateEventPublisher();

You'd probably do this during app startup and make your publisher a singleton, since you only need one instance of him for the app. We can talk more about that later. This code tells WhaleBus to use Google Protocol Buffers for its serialization mechanism (you can also use Json), and what assembly contains the events. If you've got events spread over multiple assemblies, just keep calling EventsAreInAssemblyContainingType... It'll add them for you. To publish, you simply call Publish<TEvent>(TEvent @event) on the publisher. WhaleBus will do all sorts of cool stuff for you that we'll discuss in a future blog.
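Pieced together, the bootstrap reads something like this - EventsAreInAssemblyContainingType, StructureMapBuilder, and PubSubSettingsFromFile appear elsewhere in this post; the chain's entry point and the serializer method name are my guesses:

```csharp
// Sketch of the WhaleBus bootstrap -- method names partly guessed.
WhaleBusConfiguration
    .Configure()
    .StructureMapBuilder(new PubSubSettingsFromFile(@"c:\specialsupersecret\"))
    .GoogleProtocolBufferSerialization()
    .EventsAreInAssemblyContainingType<StoreAddedToSite>();

var publisher = EventPublisherFactory.CreateEventPublisher();
publisher.Publish(new StoreAddedToSite { Domain = "example.com", Name = "Example Store" });
```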

Here's the code that's in the Example on github for a publisher.

using System;
using Events.Stores;
using WhaleBus.Implementations;
using WhaleBus.Infrastructure;
using WhaleBus.Infrastructure.Publisher;

namespace WhaleBus.PublisherConsole
{
    class Program
    {
        static void Main(string[] args)
        {
            // ...
                .StructureMapBuilder(new PubSubSettingsFromFile(@"c:\specialsupersecret\"))

            var publisher = EventPublisherFactory.CreateEventPublisher();

            Console.WriteLine("Enter Store Name");
            var name = Console.ReadLine();
            Console.WriteLine("Enter Domain");
            var domain = Console.ReadLine();
            publisher.Publish(new StoreAddedToSite { Domain = domain, Name = name });
            Console.WriteLine("Published Event.");
        }
    }
}

Now let's show a subscriber. We handle subscribers a bit differently. For now, I've written a console app called WhaleBusHost. To use it, you simply reference it and write a class that implements the IConfigureThisSubscriber interface. Here's the one that's in the example on github.

using Events.Stores;
using WhaleBus;
using WhaleBus.Implementations;
using WhaleBus.Infrastructure;
using WhaleBusHost;

namespace AnEndpoint
{
    public class ConfigureMe : IConfigureThisSubscriber
    {
        public ConfigureMe()
        {
            // ...
                .StructureMapBuilder(new PubSubSettingsFromFile(@"C:\SpecialSuperSecret\"))
        }
    }
}


That's really it. WhaleBus wires up subscriptions for any event that has a handler in any assembly in the application's bin folder. So as you add new functionality (handlers), you don't have to manually do much of anything. Just run WhaleBusHost.exe.

We'll discuss how everything works on the next post.

Saturday, April 16, 2011

Inactive for a bit

Sorry to everyone for not blogging for a while. Started a new gig and haven't had as much personal time.

I do plan on continuing work with Ellemy.CQRS and have some pretty cool stuff that I will be blogging about in the near future.

Also, I DO have an email subscription to my Amazon account, and I checked in some code that let people use it. Well, some of the messages being pushed out to the cloud looked suspiciously like someone thought they were a really cool hacker. They weren't. And whoever you are: since I'm not using SQL, attempted SQL injection attacks sent to an event bus that doesn't use SQL probably won't do anything except make me laugh when I read the email I get for every message published.

I went ahead and changed my Access Key. So sorry, folks - if you wanna run the tests, simply sign up for an amazon account at and use your own account.

I'll catch up with everyone soon!

Happy Coding!

Wednesday, February 23, 2011

Size does matter, Crunching data for the cloud with a Google Protocol Buffer Serializer

So, most cloud services charge by some unit of size (Amazon charges by the Gig). If you're working for a company that uses these services, then, well, duhh... send around the smallest bits of data you can, right?

The way I’ve got Ellemy stuff for serialization now is that I have one little interface.

public interface ISerializer
{
    object Deserialize(string input, Type desiredType);
    object DeserializeObject(string input);
    string Serialize(object input);
}

I hate the DeserializeObject thing, but, well, that’s for another time.

I have one concrete implementation of it currently, it uses JSON. I won’t show it here, because that’s not the point of the post.

So, I've been looking at Google Protocol Buffers and the .Net project for it (protobuf-net), so I figured I'd write an ISerializer that uses protocol buffers. It looks pretty nice: apparently about half the size of Json serialization and a bit quicker too. The not-so-nice part is that the data is not self-descriptive, and you MUST know the type of the object being requested. Honestly though, that's not a problem for most apps I write. Just send the type in the message, and we're good, right?

So taking a look at the Protobuf-net wiki, I see that we’ll need data contracts for our messages.

Given a class like so.

public class SomeThingToSerialize
{
    public string SomeStringProperty { get; set; }
    public Guid SomeGuidProperty { get; set; }
    public int SomeIntProperty { get; set; }
}

We’d need a class like this so that Protocol Buffer can do its thing.

[ProtoContract]
public class SomeThingToSerialize
{
    [ProtoMember(1)]
    public string SomeStringProperty { get; set; }
    [ProtoMember(2)]
    public Guid SomeGuidProperty { get; set; }
    [ProtoMember(3)]
    public int SomeIntProperty { get; set; }
}

I don't want to muddy my problem domains with serialization-specific attributes, so I think what I'll do is actually generate the classes for the messages on the fly - let's play in CodeDom a bit!


[TestFixture]
public class ProtocolBufferGenerator_tests
{
    private ProtocolBufferDataContractGenerator _generator;

    [SetUp]
    public void arrange()
    {
        _generator = new ProtocolBufferDataContractGenerator();
    }

    [Test]
    public void message_is_decorated_with_ProtoContract_attribute()
    {
        var random = new Random();
        var testThing = new SomeThingToSerialize
                            {
                                SomeGuidProperty = Guid.NewGuid(),
                                SomeIntProperty = random.Next(),
                                SomeStringProperty = "Blah"
                            };
        var protoClass = _generator.GenerateProtoFor(testThing);
        var foundProtoContractAttribute = protoClass.GetType()
            .GetCustomAttributes(false)
            .Any(attribute => attribute.GetType().Name == "ProtoContractAttribute");
        Assert.IsTrue(foundProtoContractAttribute);
    }
}

Yeah, just what the test name says: it checks that the object this ProtocolBufferDataContractGenerator generates is decorated with the correct attribute. Let's make it so.

public class ProtocolBufferDataContractGenerator
{
    private const string _codeContractsNamespace = "Ellemy.CQRS.Serializers.GoogleProtocolBuffers.Contracts";

    public object GenerateProtoFor<T>(T thing)
    {
        var nameSpace = new CodeNamespace(_codeContractsNamespace);
        nameSpace.Imports.Add(new CodeNamespaceImport("ProtoBuf"));
        nameSpace.Imports.Add(new CodeNamespaceImport(thing.GetType().Namespace));
        var @class = new CodeTypeDeclaration(thing.GetType().Name)
                         {
                             IsClass = true,
                             Attributes = MemberAttributes.Public
                         };
        var protoContractAttribute = new CodeAttributeDeclaration("ProtoContract");
        @class.CustomAttributes.Add(protoContractAttribute);
        nameSpace.Types.Add(@class);
        var compileUnit = new CodeCompileUnit();
        compileUnit.Namespaces.Add(nameSpace);
        compileUnit.ReferencedAssemblies.Add("protobuf-net.dll");
        var thingAssembly = thing.GetType().Assembly;
        var assemblyToAdd = thingAssembly.GetName().Name + ".dll";
        compileUnit.ReferencedAssemblies.Add(assemblyToAdd);
        var parameters = new CompilerParameters { GenerateInMemory = true };
        var provider = new CSharpCodeProvider();
        var results = provider.CompileAssemblyFromDom(parameters, compileUnit);
        if (results.Errors.Count != 0)
        {
            throw new InvalidOperationException(results.Errors[0].ErrorText);
        }
        return results.CompiledAssembly.CreateInstance(
            string.Format("{0}.{1}", _codeContractsNamespace, thing.GetType().Name));
    }
}

Lots of code there, but it does work! If you're not familiar with CodeDOM, it's actually not very complicated. We're just using C# to generate C#, and the code is fairly self-documenting.

Pretty neat that we're keeping the assembly in memory, but I'm thinking that in the future we might wanna save that assembly and not pay the cost of generating that class multiple times. We'll get to that soon enough.
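Caching the generated type is easy enough to sketch. This is just my illustration of the idea, not code from WhaleEs; ContractTypeCache is a made-up name:

```csharp
using System;
using System.Collections.Concurrent;

// Sketch: compile the contract once per source type and reuse it,
// so the expensive CodeDOM compile doesn't run on every serialize call.
public class ContractTypeCache
{
    private readonly ConcurrentDictionary<Type, Type> _cache =
        new ConcurrentDictionary<Type, Type>();
    private readonly Func<Type, Type> _generate;

    public ContractTypeCache(Func<Type, Type> generate)
    {
        // e.g. t => generator.GenerateProtoFor(Activator.CreateInstance(t)).GetType()
        _generate = generate;
    }

    public Type GetContractType(Type sourceType)
    {
        // GetOrAdd runs the generator at most once per source type
        return _cache.GetOrAdd(sourceType, _generate);
    }
}
```

The Serializer would then ask the cache for the contract type instead of calling the generator directly.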

So far, we’re creating an empty class decorated with the ProtoContract attribute, which is perfectly worthless since we don’t have the properties. Let’s fix that.

First, make sure we’re adding properties on the message.

   1:   [Test]
   2:          public void properties_are_added()
   3:          {
   4:              var expectedNumberOfProperties = typeof(SomeThingToSerialize).GetProperties().Count();
   5:              var actualNumberOfProperties = _protoClass.GetType().GetProperties().Count();
   6:              Assert.AreEqual(expectedNumberOfProperties,actualNumberOfProperties);
   7:          }

To make it pass I wrote this little method and called it from the GenerateProtoFor method (around line 19, though it actually doesn't matter where).

   1:   private void AddProperties<T>(T thing, CodeTypeDeclaration @class)
   2:          {
   3:              foreach (var propertyInfo in thing.GetType().GetProperties().OrderBy(p => p.Name))
   4:              {
   5:                  var field = new CodeMemberField
   6:                                  {
   7:                                      Type = new CodeTypeReference(propertyInfo.PropertyType.FullName),
   8:                                      Attributes = MemberAttributes.Private,
   9:                                      Name = "_" + propertyInfo.Name
  10:                                  };
  11:                  @class.Members.Add(field);
  12:                  var @property = new CodeMemberProperty
  13:                                      {
  14:                                          Name = propertyInfo.Name,
  15:                                          HasGet = true,
  16:                                          HasSet = true,
  17:                                          Type = new CodeTypeReference(propertyInfo.PropertyType.FullName),
  18:                                          Attributes = MemberAttributes.Public,
  19:                                      };
  20:                  var getter = new CodeSnippetStatement(String.Format("return _{0};",propertyInfo.Name));
  21:                  @property.GetStatements.Add(getter);
  22:                  var setter = new CodeSnippetStatement(String.Format("_{0} = value;", propertyInfo.Name));
  23:                  @property.SetStatements.Add(setter);
  24:                  @class.Members.Add(@property);
  25:              }
  26:          }

Ok, so now we need to decorate each property with a ProtoMember attribute. Here’s the test.

   1:  [Test]
   2:          public void every_property_on_the_message_is_decorated_with_a_ProtoMember_attribute()
   3:          {
   5:              foreach(var propertyInfo in _protoClass.GetType().GetProperties())
   6:              {
   7:                  var foundProtoMemberAttribute = _protoClass.GetType().GetProperty(propertyInfo.Name)
   8:                 .GetCustomAttributes(false)
   9:                 .Any(attribute => attribute.GetType().Name == "ProtoMemberAttribute");
  10:                  Assert.IsTrue(foundProtoMemberAttribute);
  12:              }
  13:          }

And now let's make it pass. I wrote this little method and added a call to it inside that loop in AddProperties.

   1:   private void AddProtoMemberAttribute(CodeMemberProperty property, int memberNumber)
   2:          {
   3:              var protoBuffAttribute = new CodeAttributeDeclaration("ProtoMember");
   4:              var attributeArgument = new CodeAttributeArgument(new CodePrimitiveExpression(memberNumber));
   5:              protoBuffAttribute.Arguments.Add(attributeArgument);
   6:              @property.CustomAttributes.Add(protoBuffAttribute);
   7:          }

Sweet, now we're generating the DataContracts! We're still not actually serializing objects, though; we're just making stuff that Google Protocol Buffers can work with.

Let's add a new test.

   1:   [TestFixture]
   2:      public class using_the_GoogleProtocolBuffer_serializer
   3:      {
   4:          private Serializer _serializer;
   5:          [SetUp]
   6:          public void Arrange()
   7:          {
   8:              _serializer = new Serializer();
   9:          }
  11:          [Test]
  12:          public void serialize_an_non_DataContract_class()
  13:          {
  14:              var testThing = new TestThing { Guid = Guid.NewGuid(), Int = 1, String = "Some String"};
  15:              var output = _serializer.Serialize(testThing);
  16:              Assert.IsNotNullOrEmpty(output);
  17:              Console.WriteLine(output);
  18:          }

Ok, so not much going on here, we’re just testing that the output is actually not null, and (by extension), that the Serializer class doesn’t throw an exception.

Here’s what I did to make it work.


   1:   public class Serializer : ISerializer 
   2:      {
   3:          private readonly ProtocolBufferDataContractGenerator _protocolBufferDataContractGenerator;
   5:          public Serializer()
   6:          {
   7:              _protocolBufferDataContractGenerator = new ProtocolBufferDataContractGenerator();
   8:          }
   9:          public object Deserialize(string input, Type desiredType)
  10:          {throw new NotImplementedException("patience is a virtue"); }
  12:          public object DeserializeObject(string input)
  13:          {
  14:              throw new NotSupportedException("nope, dis dont werk ");
  15:          }
  16:          public string Serialize(object input)
  17:          {
  18:              var t = _protocolBufferDataContractGenerator.GenerateProtoFor(input);
  19:              foreach (var property in input.GetType().GetProperties())
  20:              {
  21:                  var setterForT = t.GetType().GetProperty(property.Name);
  22:                  var value = property.GetValue(input, null);
  23:                  setterForT.SetValue(t, value,null);
  24:              }
  25:              string data;
  26:              using (var writer = new MemoryStream())
  27:              {
  28:                  ProtoBuf.Serializer.NonGeneric.Serialize(writer, t);
  29:                  writer.Position = 0;
  30:                  using (var reader = new StreamReader(writer,Encoding.ASCII))
  31:                  {
  32:                      data = reader.ReadToEnd();
  33:                  }
  34:              }
  35:              return data;
  36:          }
  37:      }
  38:  }

On line 18, I simply get an instance of the DataContract class (we just saw what's in there). I then loop through all the properties on the input via reflection and set the corresponding values on the DataContract.

On lines 26-34, we're just using protobuf-net to serialize the object and return the result.

Not horribly complicated, but we’re still not done, because we can’t yet deserialize.

New test.

   1:   [Test]
   2:          public void serialize_then_deserialize()
   3:          {
   4:              var testThing = new TestThing { Guid = Guid.NewGuid(), Int = 1, String = "Some String" };
   5:              var output = _serializer.Serialize(testThing);
   6:              var result = (TestThing)_serializer.Deserialize(output, typeof(TestThing));
   7:              Assert.AreEqual(testThing.Guid, result.Guid);
   8:              Assert.AreEqual(testThing.String, result.String);
   9:              Assert.AreEqual(testThing.Int, result.Int);
  10:              Assert.AreEqual(testThing.Enum1, result.Enum1);
  11:          }

Yeah, now we're testing that it actually works. Let's make this pass.

   1:   public object Deserialize(string input, Type desiredType)
   2:          {
   3:              var bytes = ASCIIEncoding.ASCII.GetBytes(input);
   4:              var @event = Activator.CreateInstance(desiredType);
   5:              using (var stream = new MemoryStream(bytes))
   6:              {
   7:                  var thisSucksINeedToFixIt = Activator.CreateInstance(desiredType);
   8:                  var protobufferType = _protocolBufferDataContractGenerator.GenerateProtoFor(thisSucksINeedToFixIt).GetType();
   9:                  var protobuffer = ProtoBuf.Serializer.NonGeneric.Deserialize(protobufferType, stream);
  11:                  foreach (var fieldInfo in protobuffer.GetType().GetProperties())
  12:                  {
  13:                      var setter = desiredType.GetProperty(fieldInfo.Name);
  14:                      var value = fieldInfo.GetValue(protobuffer,null);
  15:                      setter.SetValue(@event, value, null);
  16:                  }
  17:              }
  18:              return @event;
  19:          }

Not too bad, huh?

Except for that thisSucksINeedToFixIt variable. I should defer to some IOC container or something there, but honestly, I don’t think events should have dependencies – yeah, argument for a different day.

Ok, so the test passes except for the Guid assertion. Gunna go take a look at that now.

Ok, after a lot of googling I realized that the issue was my ASCII encoding. I doubt anyone reading this blog cares too much, but what I did to fix it was use the BitConverter.ToString() method when serializing, and some code I ripped off of Stack Overflow to deserialize. If you're interested, yeah, go git the code!
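For what it's worth, the underlying problem is that protobuf output is arbitrary binary data, and round-tripping it through an ASCII string is lossy: any byte above 0x7F gets mangled. A hex encoding like BitConverter.ToString() is fully reversible. A minimal sketch of the idea (HexCodec is my name, not something from the codebase):

```csharp
using System;
using System.Linq;

// Sketch: hex-encode bytes so binary protobuf output survives a string round trip.
// ASCII encoding corrupts any byte above 0x7F; hex never loses information.
public static class HexCodec
{
    public static string ToHex(byte[] bytes)
    {
        return BitConverter.ToString(bytes); // e.g. { 0x4A, 0xFF } -> "4A-FF"
    }

    public static byte[] FromHex(string hex)
    {
        return hex.Split('-')
                  .Select(pair => Convert.ToByte(pair, 16))
                  .ToArray();
    }
}
```

It roughly doubles the payload size, but for getting things working it's hard to beat.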

So there we go, I like it, but there’s still a lot of work to go before it’s ready for prime time.

Some things I have to get done.

  • Handle versioning well (the order is very important, if you added a new property with the current implementation in the middle of a class, it would break old stuff)

  • Maybe even output .proto files, and have the GoogleProtocolBufferGenerator use them if they exist, that way it’ll work by default with conventions, but allow customization, and it’ll save on the overhead of generating the contract.
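The ordering problem boils down to this: right now the tags come from each property's alphabetical position, so inserting a property renumbers everything after it. One fix (just a sketch; TagMap is a name I made up, not part of the library) is to assign tags from a persisted name-to-tag map, so existing properties keep their numbers:

```csharp
using System.Collections.Generic;

// Sketch: a persisted name -> tag map per contract keeps protobuf field numbers
// stable even when new properties land in the middle of the class.
public class TagMap
{
    private readonly Dictionary<string, int> _tags = new Dictionary<string, int>();
    private int _next = 1;

    public int TagFor(string propertyName)
    {
        int tag;
        if (!_tags.TryGetValue(propertyName, out tag))
        {
            tag = _next++;            // a brand-new property gets the next free tag
            _tags[propertyName] = tag;
        }
        return tag;                   // an existing property keeps the tag it had
    }
}
```

The map itself would have to be saved somewhere, which dovetails nicely with the .proto-file idea above.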

Have fun with it! Thoughts?

Wednesday, February 16, 2011

What goes up must come down, Events from the cloud with the Amazon SQS Ellemy.CQRS subscriber

Ok, so I've got a nice little Amazon SNS publisher working for Ellemy.CQRS. So I figured I wanted to work some more with Amazon services and write a subscriber. Amazon makes a Simple Queue Service that was built to work seamlessly with their SNS, so I figured, why fight it?

Let's get to work. In keeping with the spirit of the other examples I've written for Ellemy.CQRS, I'm going to make this really simple. In fact, I'm gunna write it to behave the same way as the NServiceBus subscriber does.

Here’s the user story…

As a weird person who looks at console windows, I want to see a display on my screen that shows the message text and message id every time a message is submitted on the website so that I can clap my hands and do the funky chicken.

For now, I'm just gunna put the subscription stuff in the Amazon Publishing assembly (just like I did for NServiceBus). I don't really see a reason not to, and since I'm the boss of this product, what I say goes.

Since I’ve done a lotta leg work in my last post, I’m familiar with what we’re gunna need a subscriber to do.

  1. Create a Queue if it doesn’t exist
  2. Set an Attribute on the new Queue that gives the Topic permission to send messages to it
  3. Subscribe to topics from the Topic ARN
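Step 2 is the fiddly one. The AllowSnsAttribute() call that shows up later builds the queue's "Policy" attribute; the post never shows its body, but the document it has to produce looks roughly like this (a sketch only; the Sid and policy version are my assumptions):

```csharp
// Hypothetical sketch of the policy document set as the queue's "Policy" attribute:
// it lets the SNS topic (identified by its ARN) send messages to the queue.
public static class SnsPolicy
{
    public static string AllowSns(string queueArn, string topicArn)
    {
        return
            "{" +
            "\"Version\":\"2008-10-17\"," +
            "\"Statement\":[{" +
            "\"Sid\":\"allow-sns-send\"," +
            "\"Effect\":\"Allow\"," +
            "\"Principal\":{\"AWS\":\"*\"}," +
            "\"Action\":\"SQS:SendMessage\"," +
            "\"Resource\":\"" + queueArn + "\"," +
            // the Condition is what scopes the wildcard principal down to just our topic
            "\"Condition\":{\"ArnEquals\":{\"aws:SourceArn\":\"" + topicArn + "\"}}" +
            "}]}";
    }
}
```

Without the ArnEquals condition, anyone could drop messages on the queue, so don't skip it.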

So, let's get started. I wrote this little test.

   1:    [SetUp]
   2:          public void Arrange()
   3:          {
   4:              _config = Configure.With().AmazonPublisher();
   5:              _config
   6:                  .AwsAccessKeyId(awsKey)
   7:                  .AwsSecretKey(secret)
   8:                  .TopicArn(topicArn)
   9:                  .QueueName(tempQueueName);
  11:              _amazonClient = AWSClientFactory.CreateAmazonSQSClient(awsKey, secret);
  12:          }
  13:          [TearDown]
  14:          public void CleanUp()
  15:          {
  16:              var listQueuesRequest = new ListQueuesRequest().WithQueueNamePrefix("TESTING");
  17:              var result = _amazonClient.ListQueues(listQueuesRequest);
  18:              foreach (var queueUrl in result.ListQueuesResult.QueueUrl)
  19:              {
  20:                  _amazonClient.DeleteQueue(new DeleteQueueRequest().WithQueueUrl(queueUrl));
  21:              }
  24:          }
  25:          [Test]
  26:          public void it_will_create_the_queue()
  27:          {
  28:              //lets just leave the logic in the ctor, it'll be singleton for consumers
  29:              var subscriber = new AmazonSqsSubscriber(_config);
  31:              var listQueuesResponse = _amazonClient.ListQueues(new ListQueuesRequest());
  32:              var foundExpectedQueue = false;
  33:              foreach (var queueUrl in listQueuesResponse.ListQueuesResult.QueueUrl)
  34:              {
  35:                  if (queueUrl.Contains(tempQueueName))
  36:                      foundExpectedQueue = true;
  37:              }
  38:              Assert.IsTrue(foundExpectedQueue,"Queue was not found");
  42:          }

Basically, we just spin up the Subscriber and use Amazon's SDK to check that it did create the queue, then we kill the queue in a teardown. Note that Amazon gets all pissy if you try to create a queue with the same name as one you just deleted within 60 seconds; I got around that by (you guessed it!) waiting 60 seconds before I ran the test again. I guess we could make a unique name for the queue and use that; maybe I'll do that….
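That unique-name idea is easy enough to sketch (untested, and TestQueueNames is a name I just made up):

```csharp
using System;

// Sketch: a fresh queue name per test run sidesteps the 60-second
// "cannot recreate a recently deleted queue" restriction.
public static class TestQueueNames
{
    public static string Unique(string prefix)
    {
        // SQS queue names allow alphanumerics, hyphens and underscores (up to 80 chars),
        // so a dashless GUID suffix is safe
        return prefix + "_" + Guid.NewGuid().ToString("N");
    }
}
```

The teardown already deletes everything with the TESTING prefix, so it would still clean these up.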

Let's make this puppy pass. Here's what I did.

   1:  public class AmazonSqsSubscriber
   2:      {
   3:          private readonly AmazonConfig _config;
   4:          private AmazonSQS _client;
   6:          public AmazonSqsSubscriber(AmazonConfig config)
   7:          {
   8:              _config = config;
   9:              SetupQueue();
  10:          }
  12:          private void SetupQueue()
  13:          {
  14:              if (!String.IsNullOrEmpty(_config.SqsQueueUrl)) return;
  15:              _client = Amazon.AWSClientFactory.CreateAmazonSQSClient(_config.AccessKeyId, _config.SecretKey);
  16:              var listQueuesResponse = _client.ListQueues(new ListQueuesRequest {QueueNamePrefix = _config.SqsQueueName});
  17:              string queueUrl;
  18:              if(listQueuesResponse.ListQueuesResult.QueueUrl.Count == 0)
  19:              {
  20:                  var response = _client.CreateQueue(new CreateQueueRequest{QueueName = _config.SqsQueueName});
  21:                  queueUrl = response.CreateQueueResult.QueueUrl;
  22:              }
  23:              else
  24:              {
  25:                  queueUrl = listQueuesResponse.ListQueuesResult.QueueUrl[0];
  26:              }
  27:              _config.SqsQueueUrl = queueUrl;
  28:          }
  29:  }


Kinda ugly, but yeah, that just made the test pass. I can’t see much I can do to clean it up except a little resharper extract method stuff, I won’t bore you with what I did.

Next up, let's make that subscriber actually receive messages. Amazon gives you a "ReceiveMessage" call you can make, so we'll just set up a little loop that continually calls that and dumps out the results. Actually, let's just write the "test" – it's not really a test, since there's no assertion; I just wanna see stuff get written to my console.

   1:   [Test]
   2:          public void it_will_parse_messages()
   3:          {
   4:              var subscriber = new AmazonSqsSubscriber(_config);
   7:              var messageRequest = new SendMessageRequest()
   8:                  .WithMessageBody("This is a test")
   9:                  .WithQueueUrl(_config.SqsQueueUrl);
  10:              _amazonClient.SendMessage(messageRequest);
  12:              subscriber.Start();
  14:              subscriber.Stop();
  16:          }

When I run this, I just wanna see that “This is a test” dumped out in my console, I guess I could make it so we inject some interface for Console, but I’m too lazy to do that right now. Let’s go make this puppy pass.

Here’s what I added.

   1:          private volatile bool _stopped; // was [ThreadStatic], which only works on static fields; volatile gives cross-thread visibility
   3:          public void Stop()
   4:          {
   5:              Console.WriteLine("Stopping....");
   6:              _stopped = true;
   7:          }
   8:          public void Start()
   9:          {
  10:              ThreadPool.QueueUserWorkItem(delegate { BeginWork(); });
  11:          }
  12:          private void BeginWork()
  13:          {
  14:              while(!_stopped)
  15:              {
  16:                  DoWork(); // call directly; queueing a new work item every iteration would flood the thread pool
  17:              }
  18:          }
  19:          private void DoWork()
  20:          {
  21:              var response = _client.ReceiveMessage(new ReceiveMessageRequest { QueueUrl = _config.SqsQueueUrl });
  22:              if (!response.IsSetReceiveMessageResult())
  23:              {
  24:                  return;
  25:              }
  26:              var messageResult = response.ReceiveMessageResult;
  27:              foreach (var message in messageResult.Message)
  28:              {
  29:                  Console.WriteLine(message.Body);
  30:              }
  31:          }
  32:      }

Run the test, and here’s what I see.


Looks good, although I KNOW there's a bug, because Amazon leaves the message on the queue. I'll write another test that demonstrates the bug, but since I was too lazy to inject some "Console writer" thing that I could do assertions on, I'll just have to use my eyes to see that it's "failing".

   1:    [Test]
   2:          public void the_message_is_only_processed_once()
   3:          {
   4:              var subscriber = new AmazonSqsSubscriber(_config);
   7:              var messageRequest = new SendMessageRequest()
   8:                  .WithMessageBody("This is a test")
   9:                  .WithQueueUrl(_config.SqsQueueUrl);
  10:              _amazonClient.SendMessage(messageRequest);
  12:              subscriber.Start();
  14:              Thread.Sleep(15000);
  15:              subscriber.Stop();
  16:          }

Ok… so when I run that… it does exactly what I expect. That's SQS's visibility timeout at work: once a message has been received, it's hidden from subsequent receives for a while. Regardless, I know (from the Amazon documentation) that I need to manually delete the message after I've processed it, or it'll eventually come back. So, whatever, I'm gunna go do that. Here's what I did.

   1:  private void DoWork()
   2:          {
   3:              var response = _client.ReceiveMessage(new ReceiveMessageRequest { QueueUrl = _config.SqsQueueUrl });
   4:              if (!response.IsSetReceiveMessageResult())
   5:              {
   6:                  return;
   7:              }
   8:              var messageResult = response.ReceiveMessageResult;
   9:              foreach (var message in messageResult.Message)
  10:              {
  11:                  Console.WriteLine(message.Body);
  12:                  Delete(message);
  13:              }
  14:          }
  16:          private void Delete(Message message)
  17:          {
  18:              var deleteRequest = new DeleteMessageRequest()
  19:                  .WithReceiptHandle(message.ReceiptHandle)
  20:                  .WithQueueUrl(_config.SqsQueueUrl);
  21:              _client.DeleteMessage(deleteRequest);
  22:          }


Everything looks like it’s working.

Ok, technically we've met the requirements of the AC, but we don't wanna couple up that Console.WriteLine stuff. We wanna make this thing work so that Domain Event handlers handle the events that are sent in messages – I mean, that was the whole point of this, right?

First of all, we’re not yet subscribing to a topic, lets rip off the code from my last post to do that.

Wow…. This made it ugly! But it’s working for now, we’ll clean up in a bit.

   1:    private void SetupQueue()
   2:          {
   3:              if (TheQueueIsAlreadySet()) return;
   4:              var queueUrl = GetOrCreateQueueUrl();
   5:              _config.SqsQueueUrl = queueUrl;
   6:              SubscribeToTopic();
   7:          }
   9:          private void SubscribeToTopic()
  10:          {
  11:              SetPermissions();
  12:              var getArnRequest = new GetQueueAttributesRequest().WithQueueUrl(_config.SqsQueueUrl).WithAttributeName("QueueArn");
  13:              var clientArn = _client.GetQueueAttributes(getArnRequest).GetQueueAttributesResult.Attribute[0].Value;
  15:              var sns = Amazon.AWSClientFactory.CreateAmazonSNSClient(_config.AccessKeyId,
  16:                                                                      _config.SecretKey);
  18:              var subscriptionRequest = new SubscribeRequest()
  19:                  .WithEndpoint(clientArn)
  20:                  .WithProtocol("sqs")
  21:                  .WithTopicArn(_config.TopicAccessResourceName);
  23:              sns.Subscribe(subscriptionRequest);
  24:          }
  26:          private void SetPermissions()
  27:          {
  28:              var setQueueAttributeRequest = new SetQueueAttributesRequest()
  29:                 .WithQueueUrl(_config.SqsQueueUrl)
  30:                 .WithAttribute(new Attribute { Name = "Policy", Value = AllowSnsAttribute() });
  31:              _client.SetQueueAttributes(setQueueAttributeRequest);
  32:          }

You'll recognize a lot of this code from my last blog, so it's not new. I really need to move the concept of subscribing out into its own little class, but I'll get to that soon enough.

Changed my test to look like this.

   1:  [Test]
   2:          public void the_message_is_only_processed_once()
   3:          {
   4:              var subscriber = new AmazonSqsSubscriber(_config);
   6:              var @event = new TestEvent {SomeGuid = Guid.NewGuid(), SomeInt = 1, SomeString = "Some String"};
   7:              _publisher.Publish(@event);
   9:              subscriber.Start();
  10:             //Ugly, but I wanna make sure the message arrives
  11:              Thread.Sleep(3000);
  13:              subscriber.Stop();
  15:          }


here’s what I see.


Sweet! Now events published (line 7 in the above code) actually do make it to my subscriber. Next, we’re gunna actually deserialize that Message property to get the actual Domain Event instead of just doing a Console.Write(message.Body).

Ok, first things first, we've gotta be able to load the type. I added a cute little method to my AmazonConfig class that lets consumers specify which assembly (or assemblies) contain the events they'll be publishing. Like so.

   1:   public AmazonConfig EventsAreInAssemblyContainingType<TEvent>()
   2:          {
   3:              if (EventAssemblies == null)
   4:                  EventAssemblies = new List<Assembly>();
   5:              EventAssemblies.Add(typeof(TEvent).Assembly);
   6:              return this;
   7:          }


Just call that method in my test, and change my AmazonSubscriber to look like this. Bear with the ugly for now; I'll be cleaning it up. The point of this isn't to show how clean I can make code, it's to show that I can get it to work. Pull down the code from git and you'll see that I clean my stuff up, darnit! Anyway, here's the (very ugly) code.

   1:  private void DoWork()
   2:          {
   3:              var response = _client.ReceiveMessage(new ReceiveMessageRequest { QueueUrl = _config.SqsQueueUrl });
   4:              if (!response.IsSetReceiveMessageResult())
   5:              {
   6:                  return;
   7:              }
   8:              var messageResult = response.ReceiveMessageResult;
   9:              foreach (var message in messageResult.Message)
  10:              {
  11:                  var serializer = new JavaScriptSerializer();
  12:                  var thing = (Dictionary<String,Object>)serializer.DeserializeObject(message.Body);
  13:                  Type eventType = null;
  14:                  foreach (var eventAssembly in _config.EventAssemblies)
  15:                  {
  16:                      var t = eventAssembly.GetType((string)thing["Subject"]);
  17:                      if(t!=null)
  18:                      {
  19:                          eventType = t;
  20:                          break;
  21:                      }
  22:                  }
  23:                  if (eventType == null)
  24:                      throw new ConfigurationException(
  25:                          String.Format(
  26:                              "Could not load type {0}, please make sure you call AmazonConfig.EventsAreInAssemblyContainingType<TEvent>() during bootstrap.",
  27:                              thing["Subject"]));
  29:                  var @event = serializer.Deserialize((string)thing["Message"], eventType);
  30:                  Console.WriteLine(@event);
  31:                  Delete(message);
  32:              }
  33:          }


Still doing that Console.WriteLine, of course, but yeah, now when I run the test I'm getting "AmazonTests.TestEvent" instead of that JSON string. Now let's just change that Console.WriteLine to instead locate all the event handlers for the event and execute them.

Once again, shut up about ugly, this is RED/GREEN/CLEAN land, clean is last, I wanna make it work before the code looks pretty.

I changed that line 30 from above to this.

   1:    var @event = serializer.Deserialize((string)thing["Message"], eventType);
   2:                  var handlerInterface = typeof(IDomainEventHandler<>).MakeGenericType(eventType);
   3:                  foreach (var handler in _config.EllemyConfiguration.ObjectBuilder.BuildAll(handlerInterface))
   4:                  {
   5:                      var handlerMethod = handler.GetType().GetMethod("Handle");
   6:                      handlerMethod.Invoke(handler, new [] { @event });
   7:                  }
   8:                  Delete(message);


I'll then go write a quick handler in the test assembly that'll do the Console.WriteLine, just so I can still see all that pretty stuff. Like this.

   1:   public class TestEvent : IDomainEvent
   2:      {
   3:          public Guid SomeGuid { get; set; }
   4:          public String SomeString { get; set; }
   5:          public int SomeInt { get; set; }
   6:      }
   7:      public class ConsoleWriter : IDomainEventHandler<TestEvent>{
   8:          public void Handle(TestEvent @event)
   9:          {
  10:              Console.WriteLine("SomeGuid: \t{0}",@event.SomeGuid);
  11:              Console.WriteLine("SomeInt: \t {0}",@event.SomeInt);
  12:              Console.WriteLine("SomeString: \t{0}",@event.SomeString);
  13:          }
  14:      }


To make it all work, I changed the setup method in my test to look like this.

   1:    [SetUp]
   2:          public void Arrange()
   3:          {
   5:              tempQueueName = "Test_QUEUE";
   6:              _config = Configure.With()
   7:                  .StructureMapBuilder()
   8:                  .HandlersAreInAssemblyContainingType<TestEvent>()
   9:                  .AmazonPublisher();
  11:              _config
  12:                  .AwsAccessKeyId(awsKey)
  13:                  .AwsSecretKey(secret)
  14:                  .TopicArn(topicArn)
  15:                  .QueueName(tempQueueName)
  16:                  .EventsAreInAssemblyContainingType<TestEvent>();
  18:              _publisher = new AmazonPublisher(_config);
  20:              _amazonClient = AWSClientFactory.CreateAmazonSQSClient(awsKey, secret);
  22:          }

So, note that we're telling it to use my StructureMapBuilder (line 7), and telling it where the DomainEventHandlers are (line 8).

Let's give the test a run.



Lots of stuff going on there, and I've got a lot of cleaning to do, but it's working! Feel free to pull down the code from GitHub. If you'd like to contribute, I'm game too!

Have fun with it all, and I hope this is helpful.