  • Happened to wake up ten minutes before the Nobel Peace Prize announcement. Predictions: Donald Trump will not win, the United States will declare war on Norway.

    → 4:53 AM, Oct 10
  • Copy git hashes

    I’ve been reaching more and more for git history commands to get details about the file I’m working on. I used to use tools like GitHub Desktop or Sublime Merge, but I never felt like they added that much value; it’s just faster to call up a git log somefile or git log -L 5,10:somefile. The only shortcoming of this approach is it generally leaves me wanting a commit hash in my clipboard (often to switch to or to run git diff with). No more! Today I doubled down on grabbing these hashes without having to mouse over and select the hash; I give you: git log --pretty=oneline myfile | head -c7 | pbcopy This is the simplest form of this that I can find.

    --pretty=oneline ensures the commit hash comes first; piped into head -c7 we get the first 7 characters of the hash (you could grab more, or use some kind of regex to get the whole thing, but I believe 7 is the minimum you can give git where it will reliably find a commit). Pipe it to pbcopy and you’ve got a little git hash.
    It’s a fair amount of typing. I think I could set --pretty=oneline in my git config, and frankly I could likely alias this whole thing as some kind of function in my .zshrc, but for now it is what it is.
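
    If I ever do get around to the alias, it would probably look something like this sketch for .zshrc (copyhash is a made-up name):

    # copy the 7-character hash of the newest commit touching a file
    function copyhash() {
        git log --pretty=oneline -- "$1" | head -c7 | pbcopy
    }
    # usage: copyhash somefile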

    → 9:01 PM, Oct 9
  • Pretty funny that Obsidian makes you type out the “Quit without saving” Vim command to enable Vim mode. I love it but this is the first knowledge test I’ve taken to toggle a setting.

    → 3:11 PM, Oct 6
  • “Abstract” is the academic term for TLDR.

    → 10:18 AM, Sep 24
  • Doubling down on my Vim usage lately, getting to know some of the dot motions really feels like a superpower. I hate to say I’ve drunk the Kool-aid but I think I’m getting there.

    → 10:04 AM, Sep 24
  • My daughter found two monarch caterpillars on our milkweed plants which has kicked off an amazing journey.

    It’s wild how much they eat and also how fast they grow.

    → 10:09 AM, Sep 18
  • Finished reading: A Different Kind of Power by Jacinda Ardern 📚

    I liked this book a lot, and I like Ardern a lot. I know she’s not overly popular in New Zealand, but all the negative reviews I’ve read center on mask mandates and required vaccinations, which I have no issues with. I’d be curious to hear from some Kiwis about this.

    → 6:23 AM, Sep 15
  • Simplicity in Editors

    Two things I’ve read/seen recently which have served as a huge inspiration to me:

    Mitch Hashimoto talking about his austere editor setup, and yobibyte talking about his plugin-free Neovim setup.

    These talks inspired me to make the full jump over to Neovim. Admittedly, I’m using LSPs, Treesitter, and a few other plugins, but it’s a big step back from the full-featured setup of Zed or Sublime that I was running before. I’m still in the “this is hard” phase, but I think that moving a bit slower and focusing on each line, as opposed to all the jumping around I did before, will ultimately give me a deeper understanding of the codebase and the discipline to keep bigger mental models in my head.

    → 9:42 AM, Sep 10
  • Apparently Netanyahu is telling residents of Gaza to “leave now”. Sorry, but where the hell are they supposed to go? They’ve been blocked from leaving for 20 years.

    → 12:54 PM, Sep 8
  • I work from home, in the basement; my 1-year-old cries whenever I go downstairs. My wife has to stand with her at the top of the stairs and they wave me down as I go. It feels like leaving for the office, but like 10 times a day.

    → 9:03 AM, Sep 8
  • Is learning Vim actually faster if you then spend multiple hours a week tweaking your Neovim config?

    → 6:40 AM, Sep 8
  • From the morning walk.

    Traffic cone bobbing just below the surface of the water

    → 8:32 AM, Sep 6
  • Finally took the plunge and decided to start paying for Micro.one… Very good chance I will upgrade to the full service soon.

    → 3:53 PM, Sep 4
  • Before sending that email to hundreds of thousands of customers… ask yourself, is announcing “we now have dark mode” broadcast worthy?

    → 8:00 PM, Jun 30
  • The NYC Primary

    It’s been a rough couple of weeks in world news. A lot has been going on that I’ve felt moved to comment on but haven’t had the heart to actually write down. Zohran Mamdani’s victory in the NYC primary is a ray of sunshine in otherwise very dark times. It’s a powerful reminder that progressives can win even against massive entrenched interests. In the final weeks of the race, billionaires and powerful centrist Democrats such as Bill Clinton were pouring millions of dollars and coveted endorsements, respectively, into Cuomo’s campaign in what amounted to an attack on Mamdani. The attack failed. The voice of the people could not be silenced. Big as New York is, on the scale of everything else going on in the world this is kind of small potatoes, but a win for progressives anywhere is a victory for progressives everywhere. I’ll take it.

    → 8:00 PM, Jun 24
  • The Underground Railroad

    I’ve been on a Colson Whitehead tear in the past year, having started five of his books and finished four of them. This year I raced through the Ray Carney series (Harlem Shuffle and Crook Manifesto) and I just finished The Underground Railroad. While not my favourite of his books, The Underground Railroad was still a compelling read. Whitehead has this talent that I struggle to explain. He’s very good at writing historical fiction that makes you sad or angry at the history without feeling sad or angry with the story. That’s what buoys up books like The Underground Railroad; it was a fantastic read, I daresay a borderline fun read, but it also served as a poignant reminder of the atrocities of chattel slavery, to the point that I was physically moved. This is undoubtedly a hard balance to strike, but Whitehead has managed to do it in nearly every book he’s written.

    → 8:00 PM, Jun 23
  • White Fragility

    Full Title: White Fragility: Why It’s So Hard for White People to Talk About Racism. Although it’s a short read, this book was dense. That’s not to say it was a difficult read; quite the contrary, it was extremely approachable, but every single page was so laden with facts that each paragraph served as an essay unto itself. White Fragility asks left-leaning, progressively minded folks to examine their own attitudes towards race; are we more concerned with being racist or being perceived as racist? Do we only think of racists as “very bad people”, the kind who form lynch mobs or march with tiki torches? Or are we able to see how our own race has given us an unfair advantage? Are we able to see how we silently perpetuate racial disparities to suit our own needs? Do we do this in subtle subconscious ways or more overtly by proclaiming that we are “colour blind” and therefore race doesn’t matter?

    Not only did White Fragility implicate me in my own racism, it also gave me pause to reflect on other areas in which I have blindspots. Benefiting from the various privileges I have, not just as a consequence of my race but also my gender, sexual identity, appearance, etc. What things have I said or done over the years that uphold and reinforce the patriarchy? Am I excluding disabled people in my actions (a very salient question for somebody who designs and builds websites; I reckon this site is not fully WCAG compliant)?

    Definitely worth a read, likely a second in a few years.

    → 8:00 PM, Jun 15
  • Weekly Round Up: June 13, 2025 👻

    It was a week of state machines. Two separate Rails projects, two separate state machine libraries (state_machines and aasm), both sending emails. One is a fairly straightforward project for a department of education; it’s an old codebase, but Objective built it and has been working on it ever since. As such, it’s fairly clean and straightforward for its age. I think that the more contractors and firms a codebase passes through, the more muddled it gets. I’ve been working on this codebase for about two years now. The entire time I’ve been working to convert an old paper process to a digital one; it’s not an overly ambitious project, but the budgeting has necessitated a slower pace of development. With only a few months left in the yearly budget (in education I guess the fiscal year ends with the school year) I was asked to quickly implement a form that allows users to draft a custom email message and attach a PDF. It’s been a while since I’ve done this with Rails; my last experience doing so was in the Paperclip days and that was not too fun. I’ve been pleasantly surprised with ActiveStorage, it’s much more plug-and-play than I recall (I’ve also been developing a lot longer now).
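
    The attachment bit ends up being pleasantly small with ActiveStorage; roughly this sketch (the mailer and argument names are hypothetical, not the client’s actual code):

    class CustomMessageMailer < ApplicationMailer
      # body: the user-drafted message; attachment: an ActiveStorage attachment
      def custom_message(recipient, subject, body, attachment)
        # download the blob's bytes and attach them to the outgoing email
        attachments[attachment.filename.to_s] = attachment.download
        @body = body
        mail(to: recipient, subject: subject)
      end
    end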

    The other project is far more involved: my new full-time gig at Built. It’s been exciting to work in tandem with another developer who has been handling the front-end work. Coming from a small agency, I’ve always developed features full stack. Part of why I wanted to switch to a dedicated product team was to have experiences like this one, where a greater degree of planning and coordination between developers was required. I started by creating a model last week and writing as many tests as I thought would be relevant. I’ve been through TDD phases in the past, but I think in small teams and projects TDD offers diminishing returns. It makes a lot of sense in a scenario like this, even on a fairly small team, since I’m developing features that I won’t be able to test in the browser until the other developer has her features in place. She in turn won’t be able to know if the front end works until my code is merged into her branch. This feature was the bulk of my week, but it came together in time for some Friday afternoon QA, and I’m sure there will be several things to fix on Monday morning.

    → 8:00 PM, Jun 12
  • Multi-tenancy with Phoenix and Elixir

    There are lots of good places to start with multi-tenancy in Elixir (although I’d recommend Ecto’s own docs for either foreign keys or Postgres schemas). Most of the write-ups and tutorials start the same way: “generate a new Phoenix application with mix phx.new”. While this is great if you’re starting an enterprise SaaS app from scratch, it leaves something to be desired if you, like I was, are migrating an existing codebase with thousands of users and products to a multi-tenant application. I recently went through this with an enterprise client and there were enough pitfalls and interesting problems to solve that it seemed to warrant a detailed post.

    I believe the solution I put together is both effective and elegant, but it is not without its pain points. Mainly, if you are going to use PostgreSQL schemas (which I did) you are going to have to migrate your existing data into said prefixes. There is no easy way around this, it’s just a slog you have to do; more on that later.

    Schemas?

    I went back and forth for a while and finally settled on query prefixes, as they felt a little more elegant: segmenting data without having to add new foreign keys to any tables. It also makes it easy to migrate or wipe customer data if needed. Admittedly, if you’re managing tens of thousands of tenants in a single database this approach will be a bottleneck. In my case that was not a concern; there are two current tenants and the client only expects to add a few tenants every year, if that. As mentioned, Ecto has great docs on setting up schemas; however, I opted to use a dependency called Triplex, mostly for the sake of time (about a week in I realized I could have rewritten most of the required features in a day or two, but we had about a month to make this transition so a refactor at that point seemed like overkill). Schemas work because we are using PostgreSQL; you can kind of hack together “schemas” with MySQL, but under the hood it’s just separate databases. I can’t vouch for that approach because my Elixir projects are mostly in Postgres.
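
    For reference, the basic Triplex wiring is tiny; a minimal sketch assuming the library’s defaults (the tenant name is illustrative):

    # config/config.exs
    config :triplex, repo: MyApp.Repo
    
    # creates the "salt_lake" schema and, by default, runs the migrations
    # found in priv/repo/tenant_migrations
    Triplex.create("salt_lake")
    
    # to_prefix/1 gives you the prefix to hand to Ecto
    Repo.all(Customer, prefix: Triplex.to_prefix("salt_lake"))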

    The first big hurdle is ensuring that your queries run in the right schema. By default Ecto is going to run queries in the public schema. On any given query you can change this by passing in a prefix: option, ie: Repo.one(query, prefix: "some_prefix"). Now, rewriting hundreds or thousands of Repo actions with a variable prefix is not exactly convenient, but it’s imperative to ensure queries are scoped to the correct schema. Just imagine the catastrophic breach if Customer A got back Customer B’s data!

    Thankfully you do not have to rewrite all your queries to explicitly pass a prefix. There are some handy built-in behaviours from Ecto.Repo. Enter Repo hooks! Ecto.Repo comes with some great behaviours that let you effectively write Repo.one(query, prefix: "some_prefix") without actually writing it for every single query. You can implement prepare_query/3 to filter and modify the prefix. You add these hooks to YourApp.Repo. This is prepare_query/3 in its simplest form:

    @impl true 
    def prepare_query(_operation, query, opts) do 
    	opts = Keyword.put(opts, :prefix, "some_prefix")
    	{query, opts}
    end
    

    Now all queries will look at the some_prefix schema rather than the public schema. In our app we had a few tables that we wanted scoped to public. For example you may have an admins table, or possibly oban_jobs, tenants, etc. You can handle this in a few ways:

    @impl true 
    def prepare_query(_operation, query, opts) do 
    	if opts[:skip_prefix] do 
    		{query, opts}
    	else 
    		opts = Keyword.put(opts, :prefix, "some_prefix")
    		{query, opts}
    	end 
    end
    

    This works, although it necessitates passing skip_prefix: true to all your Repo calls; likely fewer than before, but still kind of defeating the purpose of prepare_query/3.

    @sources ~w[admins oban_jobs oban_peers customer_pricing]
    
    @impl true 
    def prepare_query(_operation, %Ecto.Query{from: %{source: {source, _}}} = query, opts) when source in @sources do 
    	{query, opts}
    end 
    
    def prepare_query(_operation, query, opts) do 
    ... 
    end
    

    By pattern matching on your allowed tables you can bypass your prefix override. I used a combination of both of the above approaches, with a list of allowed source tables as well as the option to skip_prefix, which adds a manual override to the API. In theory you shouldn’t need it, but you never know: tests, edge cases, shrugs…

    Tenant Selection

    At this point we’ve converted every query in the application to use a configurable prefix in about 10 lines of code. Not bad, but it’s also not actually dynamic yet; I’ve hard-coded some_prefix into my queries. Before we make the hook dynamic we need to determine how Phoenix is going to recognize the tenant. There are many ways of doing this; in my case, for now, we are using subdomains.

    Since the subdomain is available on conn.host, I set up a plug to fetch the subdomain:

    defmodule MyApp.TenantPlug do
    ...
    
    def select_organization_from_domain(conn, _opts) do 
    	subdomain = get_subdomain(conn) 
    	put_session(conn, :tenant, subdomain)
    end
    
    defp get_subdomain(%{host: host}) do 
    	[subdomain | _] = String.split(host, ".")
    	subdomain
    end
    

    This gets the subdomain and puts it in the session (which is not strictly necessary but is nice to have). Next, let’s pass it to Repo; as with the queries, one need not rewrite all Repo calls passing in a :subdomain option. Here Elixir/Phoenix has your back: each request runs in its own process, and that process can store data for itself. Back in Repo I added these little helpers:

    @tenant_key {__MODULE__, :tenant}
    
    def put_tenant_subdomain(subdomain) do 
    	Process.put(@tenant_key, subdomain)
    end	
    
    def get_tenant_subdomain do 
    	Process.get(@tenant_key)
    end
    

    Now back in the TenantPlug we can add the subdomain to the process:

    def select_organization_from_domain(conn, _opts) do 
    	subdomain = get_subdomain(conn)
    	Repo.put_tenant_subdomain(subdomain) 
    	put_session(conn, :tenant, subdomain)
    end
    

    A second Repo behaviour can be used to pass options to the Repo call: default_options/1. Rather than explicitly writing opts = Keyword.put(opts, :prefix, "some_prefix") in the prepare_query/3 hook, default_options/1 will set up your opts before the Repo function runs. From there we call get_tenant_subdomain/0 to retrieve the subdomain/query prefix we set in the plug:

    @impl true 
    def default_options(_operation) do 
    	[prefix: get_tenant_subdomain()]
    end 
    
    # note: the same key the put/get helpers above use
    @tenant_key {__MODULE__, :tenant}
    def get_tenant_subdomain, do: Process.get(@tenant_key)
    

    Like prepare_query/3, default_options/1 will run with every query.

    With this implemented, navigating to a specific subdomain will set the tenant in the current process (as well as in the session) and any database queries in that session will be scoped to the tenant’s schema. Putting it all together we have something like this in repo.ex:


    @allowed_sources ~w[oban_jobs tenants]
    
    @impl true
    def default_options(_operation) do
      [prefix: get_tenant_subdomain()]
    end
    
    @impl true
    def prepare_query(_operation, %Ecto.Query{from: %{source: {source, _}}} = query, opts)
        when source in @allowed_sources do
      opts = Keyword.put(opts, :prefix, "public")
      {query, opts}
    end
    
    def prepare_query(_operation, query, opts) do
      if opts[:skip_prefix] do
        # manual escape hatch: force the public schema
        opts = Keyword.put(opts, :prefix, "public")
        {query, opts}
      else
        # default_options/1 has already injected the tenant prefix
        {query, opts}
      end
    end
    
    @tenant_key {__MODULE__, :tenant}
    
    def put_tenant_subdomain(subdomain) do
      Process.put(@tenant_key, subdomain)
    end
    
    def get_tenant_subdomain do
      Process.get(@tenant_key)
    end
    

    The simplified version of my tenant_selection_plug.ex looks like:

    def select_organization_from_domain(conn, _opts) do
      subdomain = get_subdomain(conn)
      Repo.put_tenant_subdomain(subdomain)
      put_session(conn, :tenant, subdomain)
    end
    
    defp get_subdomain(%{host: host}) do
      [subdomain | _] = String.split(host, ".")
      subdomain
    end
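    
    Since select_organization_from_domain/2 is a plain function plug, wiring it up is one line in the router; a sketch, assuming the usual Phoenix layout:
    
    # router.ex
    import MyApp.TenantPlug
    
    pipeline :browser do
      # ...the usual browser plugs...
      plug :select_organization_from_domain
    end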
    

    In production we are handling a lot more, such as authorization with Guardian, but this shows how simple it is to get a subdomain and add it to the session. The above is a fairly bare-bones approach; our final project had a lot more customization and ended up being organized a bit differently. For example, we extracted the functions dealing with getting and setting @tenant_key in the process to their own module. My hope is that the above lays the groundwork for anyone looking to do something similar.

    Data Migration

    I wish I had a solution for data migration half as slick as Ecto’s behaviours made querying database schemas. I was unable to find an elegant way to migrate relevant data to specific schemas, so I was forced to do it with good old SQL.

    -- copy locations
    INSERT INTO salt_lake.locations SELECT * FROM public.locations WHERE id = 'salt_lake_location_id';
    
    -- copy customers 
    INSERT INTO salt_lake.customers SELECT * FROM public.customers WHERE location_id = 'salt_lake_location_id';
    

    I had about 50 queries similar to this. Fortunately, tenants were mapped to locations, and at the time of the migration the client only had two tenants (the system was migrating from a product business to a consulting business). I ran these queries twice, replacing salt_lake with bakersfield on the second iteration. In my case, due to the way the system was originally designed to work with an external system (look’en at you, Quickbooks) and some changes the customer was making to how that system would be used, this migration ended up being a bit more hairy than expected. I had to write several ad-hoc queries that looked less like the above and more like:

    INSERT INTO salt_lake.qb_orders
    SELECT qb.* FROM qb_orders qb
    JOIN orders o ON o.qb_order_id = qb.id
    JOIN customers c ON o.customer_id = c.id
    WHERE NOT EXISTS (SELECT 1 FROM salt_lake.qb_orders slcqb WHERE slcqb.id = qb.id)
    AND c.name ILIKE '%A Problematic Customer%'
    

    Again, that’s not the fault of the multi-tenancy setup; migrating data in any complex system is always going to have its prickly bits. If anyone has ideas for a more elegant migration pattern (for the first two queries; ignore the last one, that’s an unfortunate specific), I’m all ears: shoot me an email self[at]travisfantina.com.
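
    One small refinement that would have cut down on the copy-and-paste, sketched here with psql variables (hypothetical; we ran the literal queries above):

    \set tenant salt_lake
    \set location_id salt_lake_location_id
    
    -- :"tenant" expands as a quoted identifier, :'location_id' as a string literal
    INSERT INTO :"tenant".locations SELECT * FROM public.locations WHERE id = :'location_id';
    INSERT INTO :"tenant".customers SELECT * FROM public.customers WHERE location_id = :'location_id';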

    → 8:00 PM, Jun 3
  • Today I Learned ~D[2025-06-02]

    File this under “things I knew but have to look up every time”…

    If you want to spin up a Docker container without a service like postgres (for example, if you have a fully seeded DB on your machine and don’t want to go through the hassle of copying/re-seeding in Docker), you can point the container at the host database with host.docker.internal. In docker-compose.yml you can write:

    environment:
          - DB_HOST=host.docker.internal
          - DB_PORT=5432
          - DB_USERNAME=your_pg_user
          - DB_PASSWORD=your_password
          - DB_NAME=your_db
    

    Because I switch projects a lot (agency life) there are occasions where a legacy codebase just stops working (system updates, deprecations, etc.). At times like these I like falling back to the Docker container (upgrading the project is not always an option), but I may not want to lose/copy all my data from when I worked on the project before. Yes, I know dev data should be ephemeral and easy to reseed, but in the real world this is not always how things work!
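
    One caveat: on Docker Desktop (macOS/Windows) host.docker.internal resolves automatically, but on Linux you typically have to map it yourself in docker-compose.yml:

    services:
      app:
        extra_hosts:
          # map host.docker.internal to the host's gateway IP
          - "host.docker.internal:host-gateway"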

    → 8:00 PM, Jun 1
  • The Anxious Generation

    America has a long history of moral panics: the phone, rock music, rap music, etc. I always want to be careful about blaming “that new thing the kids are doing”, and therefore I try to offer a balanced perspective when somebody starts talking about “the kids and the phones”. Needless to say, I went into this book with a healthy dose of skepticism.

    Right off the bat my skepticism was rewarded in the form of a lengthy anecdote about sending kids to Mars as test subjects, which Haidt uses as a metaphor for the way big tech uses kids as test subjects in social media, advertising, etc. And while I agree with the point, I felt the metaphor was heavy-handed and it kind of commenced the book with a bad taste in my mouth, although I still kept an open mind.
    Fortunately, the Mars story was the first and last of such abstract anecdotes, and the book settles into a well-researched and factual deconstruction of the way children have been raised for the past 30+ years (essentially from my childhood to now: my daughters’ childhoods) and the impact that phones and social media have had on that. I was pleased that Haidt didn’t only point fingers at “the phones”. He deconstructed the “stranger danger” panics of the 80s and 90s, which directly led to more sedentary inside time for kids and more fear mongering for both kids and parents.

    Nothing happens in a vacuum, and Haidt got the approach right; yes, social media drives young adult anxiety and depression, but it’s particularly virulent because of these devices which are in our pockets all the time. And yes, we have our phones on us all the time, but we are also less equipped to have real-world interactions because of parental fears about kidnappers and molesters which arose directly from 1980s culture. Yes, there was a lot of fear in the 80s and 90s, most of it misplaced, but there was also a new emphasis on child rearing as an active pursuit rather than a passive one. This “active” parenting was more than just being loving, supportive, and attentive; it led to the phenomenon of “helicopter parenting”, because heaven forbid kids were on their own for five minutes.

    Back to my original skepticism: I think it was largely misplaced, because the scope of this book was far broader than just “phones are bad”. The Anxious Generation is a survey of the evolution of childhood over two generations. As much as I like to push back against the panic around phones, there is a reason we don’t have a TV in our home. There is a reason we send our daughter to a Waldorf school. I acutely feel so much of what Haidt’s saying. I think too much media (social or traditional) at a young age is harmful, and not only as a lever on anxiety and depression. Kids have wonderful imaginations, and substituting this natural creativity with a screen of any kind can severely damage it. I see a pipeline from kids in front of the TV; to kids in school finding the textbook answer; to compliant adults who do what they’re told, adding value to the military industrial complex and being good consumers. (Now who’s panicking‽)

    I volunteered for a few years at a youth group for kids roughly 6-9; it was fascinating to observe the kids who, at that young age, had phones and those who did not. The ones who had phones were almost always on them; although they were still able to engage with the group and participate in meaningful ways, the phone was almost always in hand. Not only that, but these children seemed older than they should have been, more worldly. They would reference jokes and memes, laughing at the humour but not understanding the point, because they were 9 years old, not 25. Their lived experience did not match the content they were viewing, and it was ageing them without making them wise.

    I read this book because I have two daughters, but even for those without children it serves as a stark cultural survey. A rebuke not only for parents whose children’s faces are always aglow in blue light, but even for adults such as myself who spend too much time doomscrolling.

    → 8:00 PM, May 28
  • Everything Is Tuberculosis

    I’ve never read anything by John Green, but I used to watch his YouTube channel. As a young adult fiction author, I felt his handling of such a broad and complex topic, like the history of tuberculosis and its impacts on the world today, would be both engaging and digestible. I was not wrong in this assumption. This book was fast even by quick-read standards; I could have read another two or three volumes, but I think Green was able to say what he wanted to: tuberculosis is not an archaic/cured disease, it still impacts hundreds of millions of people who are for the most part neglected due to their poverty and/or race. There is hope for the future, but only if people speak up and speak out, advocating both for themselves (Green points to several activists in both India and Sierra Leone) and others, as we all must.

    Some quotes and ideas that stayed with me:

    “Framing illness as even involving morality seems to me to be a mistake… biology has no moral compass; it does not punish the evil and reward the good, it doesn’t even know about evil and good. Stigma is a way of saying ‘you deserve to have this happen’, but implied within the stigma is also ‘and I don’t deserve it, so I don’t need to worry about it happening to me’”.

    “There are many acronyms within the field of tuberculosis; global health, like any field, loves to shorten its phrases to make them obvious to experts and inaccessible to neophytes.”

    TB has always been racially charged. First it was seen as a disease of sophistication, and as such those of “darker skin” could not get it; then with industrialization it became known as a disease of the poor, and has since been used, infuriatingly, as proof of white racial superiority. And this is not just ancient history; in my lifetime:

    • J&J actively priced third world health systems out of bedaquiline - unconscionable!
    • In 2001 the head of USAID insinuated that HIV medicines couldn’t be distributed to Africa because: “Africans don’t know what watches and clocks are… when you say ‘take it at ten o’clock’ people will say ‘what do you mean by ten o’clock’” 🤯
    • Between the mid-1980s and the mid-2000s, the commingling of tuberculosis and HIV led to more deaths than the combined fatalities of WWI and WWII!

    Ending the post with a positive quote from toward the end of the book: “Mere despair never tells the whole human story, as much as despair would like to insist otherwise. Hopelessness has the insidious talent of explaining everything; the reason that x or y sucks is that everything sucks. The reason you’re miserable is that misery is the correct response to the world as we find it, and so on. I am prone to despair so I know its powerful voice, it just doesn’t happen to be true. Here’s the truth as I see it: vicious cycles are common, injustice and unfairness permeate every aspect of human life, but virtuous cycles are also possible.”

    → 8:00 PM, May 28
  • Today I Learned ~D[2025-05-22]

    There is only one false in Ruby… Or, more broadly speaking: since everything is an object, for the sake of efficiency in memory management any object that can be shared by reference will be. Immutable primitives (such as true, false, 1, 2, etc.) will only ever reference themselves. For example:

    false.object_id
    => 0
    false.dup.object_id 
    => 0
    val = false 
    val.object_id 
    => 0
    

    Duplicating false only creates another reference to false. Compare that to a mutable object like a string:

    train = "choo choo"
    train.object_id 
    => 784160
    train.dup.object_id
    => 784180
    

    Of course this intuitively makes sense but I had never run up against it until I had a spec fail:

    expect(response[:enabled]).to be true
    expect(response[:value]).to be "location"
    
    => expected #<String:140300> => "location"
      got #<String:140320> => "location"
    

    I did a double take before I realized that the object_ids were different. The first spec passes because true is an immutable object. The second one fails because "location" is not! Fix it with: expect(response[:value]).to eq "location"
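
    For what it’s worth, my understanding is that RSpec’s be matcher compares with equal? (object identity) while eq compares with == (value equality):

    a = "location"
    b = "location"
    
    a == b      # => true  (value equality, what `eq` uses)
    a.equal?(b) # => false (object identity, what `be` uses)
    
    false.equal?(false) # => true (there is only one false)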

    → 8:00 PM, May 21
  • Today I Learned ~D[2025-05-14]

    I recently switched jobs, which means new BitBucket credentials. However, I remain an occasional consultant with my last agency, so I need to keep my public key associated with their BitBucket account…

    The first thing I learned today

    BitBucket won’t let you use the same public key for multiple accounts. I find this a little odd, like how AWS won’t let you name an S3 bucket if the name already exists. It feels like a website telling you “hey, somebody is using this password, let’s try something else!” I know RSA key pairs are more secure and unique than passwords, but still 🤷

    Making multiple pushes to git easy

    You can adjust your ~/.ssh/config to easily push to separate git accounts with different keys:

    # Assume this is your default
    Host *
        UseKeychain yes 
    
    # Key 2
    Host altkey
        HostName bitbucket.org
        IdentityFile ~/.ssh/alt-key
        # you likely don't need this but it's nice to specify 
        User git 
    

    Then add/update your remote origin:

    git remote add origin  git@altkey:bitbucket_account/repo.git
    

    Instead of bitbucket.org:account you’re just subbing in the Host alias. From there SSH doesn’t care, because it’s been pointed to an IdentityFile; it may not be the system default, but it works.

    The git problems begin

    git push and:

    fatal: Could not read from remote repository.
    
    Please make sure you have the correct access rights
    

    Ok, fairly common. Let’s go through the checklist:

    1. The key is in BitBucket
    2. BitBucket access is “write”
    3. Check origin (see above)
    4. Check permissions on the public key

    And that’s about where my expertise ended.

    Diving in

    It’s useful to learn a bit of debugging. You can get pretty verbose with git logging by adding the environment variable GIT_SSH_COMMAND="ssh -vvv". Pretty cool, and I was able to confirm a few differences between pushes to a working repo and the broken one. I was also able to give this log to an LLM and bounce a few ideas off it, but ultimately I don’t feel like these logs gave me a lot of valuable info. git config --list is likewise handy, but it didn’t show me any glaring issues. So I started looking into the SSH config: ssh-add -l lists the keys you have configured. To be sure, I did ssh-add -D, which removes your keys, and then explicitly added both keys back with ssh-add ~/.ssh/[key name]. Then I ran ssh -T git@altkey, which runs a test with the alias configured in the config file. Infuriatingly, this returned:

    authenticated via ssh key.
    
    You can use git to connect to Bitbucket. Shell access is disabled
    

    So my config was correct, I had access, but I could not push. It took me an hour, but eventually I set the key for git to use explicitly:

    GIT_SSH_COMMAND="ssh -i ~/.ssh/alt-key -o IdentitiesOnly=yes" git clone git@altkey:bitbucket_account/repo.git
    

    No further issues (with either repo).
    It’s unlikely I’ll remember specifically setting the GIT_SSH_COMMAND, which is the main reason I’m writing this!
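
    A note for future me: the same setting can apparently be persisted per repository with core.sshCommand, so the environment variable isn’t needed every time (I haven’t re-verified this against this exact setup):

    # run inside the cloned repo; git then uses this command for fetches/pushes
    git config core.sshCommand "ssh -i ~/.ssh/alt-key -o IdentitiesOnly=yes"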

    → 8:00 PM, May 14
  • Class Configs with Lambdas in Ruby

    I’ve been getting reacquainted with Ruby, diving into a well-established project which has been blessed by numerous smart developers over the course of the past 10 years. I discovered an interesting pattern for gathering models (ApplicationRecord classes) that may or may not be eligible for some feature. You start with a mixin that creates a method for your classes to pass options to, as well as a method for determining if those options enable the feature or not:

    module ProvidesFeature 
        # class_methods requires ActiveSupport::Concern
        extend ActiveSupport::Concern
    
        class_methods do 
            # pass this to the model class
            def features_provided(model, **opts)
                (@features ||= []) << [model, opts]
            end
    
            # call this to initialize class feature checks
            def feature_models(ctxt)
                (@features || []).map do |args|
                    DynamicFeature.new(ctxt, args)
                end
            end
        end 
    end 
    

    Here is an example of the DynamicFeature class instantiated above. This could be a bit simpler if you didn’t want to pass any context in, but a lot of the power of this approach comes from the flexibility an argument like context gives you:

    class DynamicFeature 
        def initialize(ctxt, config_args)
            @ctxt = ctxt
            configure(config_args)  
        end
    
        def configure(args = {})
            @should_provide_feature = args.fetch(:should_feature_be_provided) do 
                -> (ctxt) { ctxt&.fetch(:person_is_admin, false) }
            end
        end 
    
        def can_feature?
            @should_provide_feature.call(@ctxt)
        end
    end 
    

    Pausing for a moment to break this down: the #configure method is the main source of the magic. First we try to fetch the keyword :should_feature_be_provided (implemented below). If we get it, we can return its value; however, there is built-in flexibility here. If args does not have a :should_feature_be_provided key, then we fall back to a default lambda which can use additional context. Again, you don’t need to pass anything else, but I view this flexibility as a strength if used strategically. Now implement it in an ActiveRecord model, ie. Person:

    class Person < ApplicationRecord 
        include ProvidesFeature 
    
        features_provided :person, 
            should_feature_be_provided: -> (ctxt) { ctxt.person.is_admin? }
    end
    
    

    You can then easily gather any models that include ProvidesFeature:

    ApplicationRecord.subclasses.select { |klass| klass < ProvidesFeature }
    

    Instantiate DynamicFeature on each class (note we are passing some context that assumes there is a person with an is_admin? method; it’s a little contrived, but it illustrates the point: you can pass additional context in when the feature_models are built):

    .flat_map { |klass| klass.feature_models(ctxt) }
    

    Then filter with can_feature? (each element here is a DynamicFeature, not the class itself):

    .select { |feature| feature.can_feature? }
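
    Putting the three steps together, the whole pipeline reads something like (the same calls as above, just chained):

    eligible = ApplicationRecord.subclasses
        .select { |klass| klass < ProvidesFeature }
        .flat_map { |klass| klass.feature_models(ctxt) }
        .select { |feature| feature.can_feature? }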
    

    At the start of this post I said this was an “interesting pattern”; I’m not necessarily saying it’s a good one. I’m still fairly new to Ruby (despite having built a few production projects back in 2016 and 2018) and the OO pattern. Personally, I found the above extremely difficult to grok, and even though I understand it now, within the context of the project I’m working on I find myself treadmilling through various files. In some ways I feel like, clever as it is, this pattern may obfuscate a little too much, but I’m open to feedback from those who have been in the OO world longer.

    → 8:00 PM, May 7
  • Weekly Roundup: May 2, 2025

    This week I formally transitioned from my fulltime consulting gig at Objective to a fulltime gig at Built For Teams; more details on that in a future post. However, broadly speaking it means that I’m dusting off my Ruby skills, diving deeper into the realm of OO programming than I ever have before.

    Farewell ASDF

    Last Friday night I pulled a Flutter repo I’m working on with a friend. I started having all kinds of issues trying to install Cocoapods. gem install cocoapods ran fine, but then flutter run produced this error:

    Warning: CocoaPods is installed but broken. Skipping pod install.
    ...
    Error: CocoaPods not installed or not in valid state.
    

    Ok. So I do some more research, throw in a sudo, no luck. pod version produces this error:

    <internal:/Users/travis/.asdf/installs/ruby/3.3.5/lib/ruby/3.3.0/rubygems/core_ext/kernel_require.rb>:136:in `require': linked to incompatible /Users/travis/.asdf/installs/ruby/3.1.6/lib/libruby.3.1.dylib -
    

    Ah! I’ve seen this more than once! Ever since I shifted to a Ruby-focused team at the start of the year, I feel like Ruby version management has been an uphill slog. I’ve reshim’d multiple times, removed versions of Ruby, removed the Ruby plugin, and reinstalled ASDF. Things work for a time, but eventually I run into errors like the above. My hunch, which may be obvious, is that something was wrong with my setup that was placing versions of Ruby inside other versions (ruby/3.3.5/lib/ruby/3.3.0); I’m not sure if the path is supposed to look like that, but it doesn’t make sense to me. I’m willing to take responsibility here; it may be that my $PATH was misconfigured (although I attempted multiple times to provide a concise path for ASDF) or that something in my system was messing with ASDF. I love ASDF, it’s served me very well for years. Being able to remove rvm and nvm and seamlessly manage Elixir versions between projects was a breath of fresh air. The docs are clear and concise, and the tool provides enough functionality to get stuff done without getting in the way. However, for whatever reason, the slog to get Ruby working just took its toll. One of my coworkers mentioned Mise, which is a drop-in replacement for ASDF. I installed it in about 30 seconds and in 45 seconds my project was running with Mise. 👏

    → 8:00 PM, May 1
  • Weekly Roundup: Apr 25, 2025

    At the agency, we have a client who has asked that customers accept terms of service before checking out. This is for an Elixir project; mostly fullstack Elixir, however the frontend has an odd assortment of sprinkles: StimulusJS and React. I created a terms_and_conditions versions table and an accompanying view helper which checks a terms_version_accepted on the user record. If the last terms_and_conditions.inserted_at date matches the terms_version_accepted, then the user is shown an active “proceed to checkout” button; if not, the button is disabled and a note asking them to accept the terms of service is displayed.
    Since most of the Elixir projects I work on are fullstack (Phoenix LiveView) I don’t often get to write API endpoints. The API work on this was admittedly very small: a simple endpoint that takes the user’s ID and updates the terms_version_accepted timestamp when they click “accept” in the modal. It returns a URL which we then append to the checkout link, allowing the user to proceed. This feature is due May 5th, but I’m hoping to get it onto the staging server on Monday or Tuesday.
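
    The view helper boils down to a timestamp comparison; roughly this sketch (module and field names are illustrative, not the actual schema):

    import Ecto.Query
    
    # the user has accepted the current terms if their accepted timestamp
    # matches the newest terms_and_conditions version
    def accepted_current_terms?(user) do
      latest =
        TermsVersion
        |> order_by(desc: :inserted_at)
        |> limit(1)
        |> Repo.one()
    
      latest != nil and user.terms_version_accepted == latest.inserted_at
    end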

    Internal Tooling:

    I’ve been using fzf for a while, but I’ve wanted to filter only unstaged files; ideally, whenever I type git add I just want to see a list of unstaged files that I can add. Admittedly I got some help from AI to write this up:

    function git_add_unstaged() {
        local files
        files=$(git diff --name-only --diff-filter=ACMR | fzf --multi --preview 'git diff --color=always -- {}')
        if [[ -n "$files" ]]; then
            BUFFER="git add $files"
            CURSOR=$#BUFFER
        fi
    }
    
    function git_add_unstaged_widget() {
        if [[ $BUFFER == 'git add' ]] && [[ $CURSOR -eq $#BUFFER ]]; then 
            git_add_unstaged 
            zle redisplay
        else 
            zle self-insert
        fi
    }
    
    zle -N git_add_unstaged_widget 
    bindkey ' ' git_add_unstaged_widget
    

    I’m wondering if I’ll find the automatic git add to be jarring, or hit situations such as a merge conflict where this may not work. If so I can always fiddle with the bindkey, but for right now I’m enjoying my newfound git add speeds.

    → 8:00 PM, Apr 24
  • Cloud Atlas

    A phenomenal read; I was thoroughly hooked into this book from 1849 to 2346. I haven’t read anything quite like this; in the hands of a less talented writer, structurally, it could have been a bit gimmicky. However, Mitchell is talented enough to provide one compelling story after another. Initially I worried that The Pacific Journal of Adam Ewing and Sloosha’s Crossin' an' Evrythin' After were going to drag because of the verbosity or eccentricity (respectively) of the language, but after a few pages I was engrossed in both.

    → 8:00 PM, Apr 22
  • Weekly Roundup: Apr 18, 2025

    Working for a small agency, I am fortunate to work on a number of fast-moving projects simultaneously. For years I’ve failed to document what I do during the week, so I’m going to start a little recap of my week: one part historical record, one part general interest. I’m posting it on my blog on the off chance that somebody reads it and, facing a similar problem, will reach out; I’m always happy to discuss what worked for me and what didn’t. It also doesn’t hurt to put this stuff into the world to show that yes, I actually do work; I haven’t always had the most active GitHub, but most of my client projects are private/proprietary. I’m easing into this; all week I was looking forward to this post; now, however, I realize I should have been working on it all along, not cramming it in from memory on a Friday night.

    This week was a balance between my ongoing Elixir projects and a newer (to me) Ruby project.

    • For the past five years I’ve either supported, or been the lead dev on, a large B2B ecommerce platform which handles a few million in daily sales. Over the winter the company began consolidating their North American and European processes, which includes using said platform for sales in the EU. Although the hope is that the European process will align with the North American one, there are some relevant differences. For example, in North America the client’s product is technically considered a “raw material”, which means there is no Value Added Tax (VAT); however, in Europe, depending on the country of origin and the destination, VAT may be charged. Other relevant changes are shipping across borders, truck loading calculations, and different invoicing procedures. At this point we are still in the research and discovery phase, but I’ve been working with another developer to scope this project out and write some preliminary tests as research.
    • For another client I’ve been moving from a QuickBooks Online integration to QuickBooks Desktop. This is a multi-tenant Elixir Phoenix app, so I’ll be keeping the Online functionality and just adding a connection to QuickBooks Desktop. The API docs for QB Online are fairly good; this is not the case with QB Desktop. It’s evident that Intuit either has the platform on life support or intentionally obfuscates the functionality to foster a consulting industry around the product. QB Desktop uses a SOAP XML type endpoint. Having wrangled fairly nasty endpoints with SAP, I wanted to, if at all possible, avoid dealing directly with QB Desktop. I discovered a service called Conductor that does the bulk of the heavy lifting and allows you to hit a very concise REST endpoint.
    • Since the beginning of the year I’ve been transitioning from primarily Elixir projects at the agency to a single Ruby-based product. On that front I’ve been involved in an ongoing integration with BambooHR, partnering with Bamboo to pull employee data from their endpoint.
    • On a personal front I finished the migration of this blog from Ghost back to markdown files. I still love Ghost but managing my own instance and integrating it with my Garden proved to be more management than I wanted.
    → 8:00 PM, Apr 17
  • Experience has shown that if you put out a bug bounty your server will be hit repeatedly with requests to /wp_admin for the rest of eternity.

    → 8:00 PM, Apr 15
  • Personal Heuristic: Make it Readable

    I wrote this post back in January, just dusted it off to post today as I attempt to get back on the blogging horse.


    Today I was refactoring a small module that makes calls to an SAP endpoint. The compiler got hung up because it couldn’t find the value item. It was an easy fix; my code looked like this:

    for itm <- data do
        # the head binds `itm`, but the match below expects `item`: compile error
        %{"MATNR" => material, "PSTYV" => category, "VBELN" => so} = item
        %{material: material, category: category, so: so}
    end
    

    It’s easy to spot (especially if the compiler tells you exactly where it is): in the comprehension head I wrote itm, but down below I’m looking for item. Simple; yet this is not the first time something similar has happened to me. It’s also not the first time I’ve specifically confused itm with item, which led me to this conclusion: just write item every time. There is an odd switch in my brain that thinks I’m penalized by the character, and that leaving the e out of item will somehow make my code more terse. While technically true, it’s not worth it. It never is; just write item, every time. People know what item is. itm is more ambiguous, not just because it only saves one letter, but because it could be an abbreviation or some weird naming convention. Why put that mental load on someone, even yourself, reading through this code? This is a tiny example, but it’s magnified in function names. While check_preq may be quick to type and take up less horizontal space in an editor, it’s not immediately clear what this function does. I would argue that get_purchase_requisition_number is a much better function name; even if you know nothing about the function, the codebase, or programming in general, you can read that and know what’s supposed to happen. Of course there are conventions, ie. the ! (dangerous) or ? (predicate) method endings in Ruby; a method ending in !, like save!, will throw an error. These sorts of things require one to be a little familiar with the patterns of a language, but that’s ok; it just means that I can write a function get_purchase_requisition_number! and anyone familiar with Ruby or Elixir will expect the function to raise or return an explicit value (as opposed to something wrapped in an :ok tuple).

    Moving forward I’m calling things what they are even if it comes with a dash of verbosity.

    → 8:00 PM, Apr 13
  • Wintering: The Power of Rest and Retreat in Difficult Times

    Reading 80% of this book was an exercise in torture. I’m always a little wary of personal-memoir-cum-self-help books, but a few have been transformative for me (ie. Pamela Druckerman’s Bringing Up Bébé). Katherine May hooked me early with this book; the prose was sharp and the anecdotes interesting. However, it very quickly devolved into anecdote after anecdote from a brief period in her life where, I guess, she was forced to work less?

    This book is rife with privilege, which doesn’t always bother me, but in this particular case it rings hollow when half of the book is dedicated to the message of “slow down, take it easy” and the other half is “go to Iceland, trek the northern tundras of Sweden”.

    Ultimately this book fell into the same trap as Gretchen Rubin’s Happier At Home (a did-not-finish from last year): an extremely self-absorbed upper-middle-class person thinking that their experience = wisdom and is therefore worth an entire book.

    Zero stars. Did not finish.

    → 8:00 PM, Jan 21
  • Today I Learned ~D[2025-01-10]

    Today’s TIL has a twist ending… so stick around.

    Elixir has a shortcut for creating anonymous functions. I’ve always written:

    greet = fn name -> "Hello, #{name}!" end 
    # which can be invoked
    greet.("Travis")
    # returns
    "Hello, Travis!"
    

    However; I came across some tricky code with a case statement:

    type = :billing
    
    docs = case type do 
    	:billing -> &[billing_act: &1]
        :shipping -> &[shipping_act: &1]
    end 
    
    # invocation
    docs.("some customer")
    # returns 
    [billing_act: "some customer"]
    

    This was very confusing to me; the fact that the anonymous function was wrapped in a case form only further obfuscated what was happening. I thought it might be some case magic.

    No. Apparently you can shortcut the aforementioned anonymous function declaration:

    greet = & "Hello, #{&1}!"
    

    You treat this as any other anonymous function. You can even have multi-arity functions:

    greet = & "Hello, #{&1} #{&2}!"
    # invocation 
    greet.("Travis", "Fantina")
    # returns 
    "Hello, Travis Fantina!"
    

    In my case the case statement could have also been written:

    docs = fn customer -> 
    	case type do
    		:billing -> [billing_act: customer]
        	:shipping -> [shipping_act: customer]
    	end
    end
    

    Plot twist: this is not a TIL. Apparently I learned this at least four years ago; that initial case function… the author was me, four years ago!

    → 8:00 PM, Jan 9