1. Accessing Webpacked TypeScript from JavaScript

    While I was trying to incorporate Stripe’s JavaScript library into my codebase, I ran into some issues implementing it in TypeScript. I was eventually able to fix those issues, but while I was figuring that out, I left the Stripe code as JavaScript and stumbled upon an interesting way to interface it with my TypeScript code compiled by webpack.

    First, at the bottom of the TypeScript file you want to call from JavaScript, export the methods you want to expose:

    module.exports = {
        methodName: methodName,
    }
    

    In webpack.common.js:

      output: {
        ...
        library: 'MyExposedNamespace'
      }
    

    Finally, in the Javascript file:

    MyExposedNamespace.methodName();
    
  2. AWS Presigned URLs in Django

    I found the documentation for presigned URLs on AWS using boto3 to be insufficient, so here is how I formed the request.

    I wanted to keep a file private in S3, but expose it to a user for download if they were authorized to access it. The easiest way I found was to leave the object in S3 as private, but use the AWS API to generate a pre-signed URL with a short timeout (a minute or so). Unfortunately, I found the documentation not so straightforward and cobbled together a few Stack Overflow answers to come up with exactly what I needed. In particular, I needed to add ‘ResponseContentType’ to get the file to download correctly, and I needed to specify my AWS credentials.

    I’m using Django to serve up the download via a get request:

    import boto3
    from rest_framework.response import Response
    from rest_framework.views import APIView
    
    
    class GetDownloadURL(APIView):
    
        def get(self, request):
            # Get the service client.
            session = boto3.session.Session(profile_name="AWSUserName")
            s3 = session.client("s3")
    
            # Generate the URL to get 'key-name' from 'bucket-name'
            url = s3.generate_presigned_url(
                ClientMethod="get_object",
                Params={
                    "Bucket": "your-s3-bucket",
                    "Key": "SampleDLZip.zip",
                    "ResponseContentType": "application/zip",
                },
                ExpiresIn=100,
            )
    
            return Response(url)
    

    Notes: You will need to update the Key and Bucket params to match what you have in S3. Depending on how you have set up your AWS credentials, you may be able to omit the ‘profile_name="AWSUserName"’ parameter. I prefer to be explicit in my config because I’ve run into issues when using the default in the past.
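    As a sanity check, the expiry is visible right in the URL itself: v4 presigned URLs carry an X-Amz-Expires query parameter. Here is a small stdlib-only sketch that pulls that value out; the sample URL below is made up for illustration, not real AWS output:

    ```python
    from urllib.parse import urlparse, parse_qs

    def presigned_url_expiry_seconds(url: str) -> int:
        """Return the X-Amz-Expires value from a presigned URL's query string."""
        query = parse_qs(urlparse(url).query)
        return int(query["X-Amz-Expires"][0])

    # A made-up example of the shape generate_presigned_url returns (v4 signing):
    sample = (
        "https://your-s3-bucket.s3.amazonaws.com/SampleDLZip.zip"
        "?X-Amz-Algorithm=AWS4-HMAC-SHA256"
        "&X-Amz-Expires=100"
        "&X-Amz-Signature=abc123"
    )
    print(presigned_url_expiry_seconds(sample))
    ```

    This is handy for confirming the ExpiresIn you passed actually made it into the generated URL.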

  3. Jekyll Setup for PML

    I originally created this blog using an EC2 instance on AWS and used that instance to run Jekyll and host the site. Recently, I shut down that instance in order to move the site generation to my local machine(s) and host the site on S3. This article was very helpful to me when setting up HTTPS with S3.

    For now, my Jekyll setup is pretty basic, and I’ve only made a handful of changes:

    - I created a pages folder to hold the few non-blog-post pages (currently just About, Books, and Projects) to clean up the home directory. I accidentally moved the homepage into the pages folder at one point and was very annoyed and confused when my homepage didn’t update as I rebuilt with changes. Consider yourself warned!

    - Instead of building from the command line, I created a simple bash script to build the site and then sync the changes to S3:

    #!/bin/bash
    
    #set destination and source so that we can run from anywhere
    #helpful when editing posts in the _posts directory
    JEKYLL_ENV=production bundle exec jekyll build -d /mnt/c/blog_location/_site -s /mnt/c/blog_location
    
    #remove the copy of this script that Jekyll puts in the site directory.
    rm /mnt/c/blog_location/_site/build-blog.sh
    
    #upload to s3
    aws s3 sync --delete --size-only /mnt/c/blog_location/_site/ s3://programmingmylife.com/ --profile MyProfile
    

    The file paths are /mnt/c/ because I typically build the site from my local Windows Subsystem for Linux install.

    - Added Google Analytics. I tried following a blog post to do this, but it broke the page (I must have mistyped something in the includes). That made me realize Jekyll now has GA built in! You just need to add the line ‘google-analytics: UA-11111111-1’ to your _config.yml (where UA-11111111-1 is your GA tracking ID) and set JEKYLL_ENV=production before building.

    - I got rid of categories in the URLs (Jekyll includes them by default). This was as easy as adding ‘permalink: /:year/:month/:day/:title.html’ to _config.yml.

    - I also wanted to add the full contents of posts to my home page. I couldn’t find a config setting for this, so I had to edit the minima theme. First, I had to find the theme files with:

    bundle show minima
    

    Then, I edited _layouts/home.html to include all post content as shown here. I may change this as the number of posts increases.

    I’d like to improve the styling on the blog or get a different theme at some point, but for now, it is mostly the current (at the time of posting) default for Jekyll.

  4. Setting up Webpack to Modify URL per Environment in TypeScript

    A website I am currently creating has a very simple front end, but I wanted to be able to swap out instances of my API URLs in TypeScript depending on which environment I am working in or building for. So far they are all ‘http://127.0.0.1:8000/’, but I didn’t want to have to remember to set a URL for prod or staging when deploying.

    Unfortunately, I was unable to find a good way to do this with just the TypeScript compiler for static files. I’m not a big fan of the complexity that webpack adds, but I’ve noticed it also minifies and obfuscates my JavaScript, which is a nice bonus. If you know of a good way to do this with the TypeScript compiler alone, please let me know!

    I tried a few different methods, but a combination of splitting my webpack config into development and production files and using DefinePlugin worked. Otherwise, I mostly just followed the installation instructions for webpack, including installing it locally. I tried installing it globally based on other tutorials, but I ran into a bunch of issues doing that.

    Also, I’m a little frustrated with how confusing the documentation was on this. I tried using Environment Variables in webpack, but I couldn’t figure out how to access those in my code (rather than just in the config file). It also took me some searching to understand how to use DefinePlugin because the documentation does not make it clear where it should go or how to include it. I found two links that helped me figure it out (see my webpack.dev.js, webpack.staging.js, and webpack.prod.js files below).

    With this setup, I run ‘npm run build:test’, ‘npm run build:staging’, or ‘npm run build:prod’ depending on whether I am working locally or building for staging or production. Those commands are mapped in package.json:

    {
      "name": "av_frontend",
      "version": "1.0.0",
      "description": "",
      "private": true,
      "scripts": {
        "test": "echo \"Error: no test specified\" && exit 1",
        "build:test": "webpack --config webpack.dev.js --watch",
        "build:staging": "webpack --config webpack.staging.js",
        "build:prod": "webpack --config webpack.prod.js"
      },
      "author": "",
      "license": "ISC",
      "dependencies": {
        "@types/bootstrap": "^4.1.2",
        "@types/node": "^10.12.12",
        "bootstrap": "^4.1.3"
      },
      "devDependencies": {
        "ts-loader": "^5.3.1",
        "typescript": "^3.2.2",
        "webpack": "^4.27.1",
        "webpack-cli": "^3.1.2",
        "webpack-merge": "^4.1.4"
      }
    }
    

    I added --watch on test so that when I’m developing in VSCode, I can just leave it running and it will update whenever I save a file. To run that manually, I have to run ‘node_modules/.bin/webpack --config webpack.dev.js --watch’ because I installed webpack locally, not globally.

    When I first got this set up, I realized it was only outputting a single JavaScript file because I didn’t understand how ‘entry’ and ‘output’ worked in the webpack config file (see code below). Now I define a named entry file for each page (both import my config with the API URLs). Under ‘output’, [name].js corresponds to each ‘entry’, so my output ends up in the dist/ folder as two files named ‘login.js’ and ‘purchase.js’, based on the two fields in the ‘entry’ object. Any file added to ‘entry’ will produce a corresponding output JavaScript file.

    I also missed something in the instructions for TypeScript when I was initially setting this up that led to a long hunt for why it wasn’t including my config.ts file (the error messages were not great). Don’t forget the ‘resolve’ field below or webpack will get confused when trying to import any file with a .ts extension referenced in another file.

    webpack.common.js

    const path = require('path');
    
    module.exports = {
      entry: {    
        login: './src/login.ts',
        purchase: './src/purchase.ts'   
      },
      devtool: 'inline-source-map',
      module: {
        rules: [
          {
            test: /\.tsx?$/,
            use: 'ts-loader',
            exclude: /node_modules/
          }
        ]
      },
      resolve: {
        extensions: [ '.tsx', '.ts', '.js' ]
      },
      output: {
        filename: '[name].js',
        path: path.resolve(__dirname, 'dist')
      }
    };
    

    webpack.dev.js

    const merge = require('webpack-merge');
    const webpack = require('webpack');
    const common = require('./webpack.common.js');
    
    module.exports = merge(common, {
      mode: 'development',
      plugins: [
        new webpack.DefinePlugin({
          'process.env': {
            'API_URL': JSON.stringify("http://127.0.0.1:8000/")
          }
        })
      ]
    });
    

    webpack.staging.js

    const merge = require('webpack-merge');
    const webpack = require('webpack');
    const common = require('./webpack.common.js');
    
    module.exports = merge(common, {
      mode: 'production',
      plugins: [
        new webpack.DefinePlugin({
          'process.env': {
            'API_URL': JSON.stringify("https://staging.com/")
          }
        })
      ]
    });
    

    webpack.prod.js

    const merge = require('webpack-merge');
    const webpack = require('webpack');
    const common = require('./webpack.common.js');
    
    module.exports = merge(common, {
      mode: 'production',
      plugins: [
        new webpack.DefinePlugin({
          'process.env': {
            'API_URL': JSON.stringify("https://production.com")
          }
        })
      ]
    });
    

    I originally wrote this blog post thinking I had solved this problem with a different approach, but that solution can only handle development and production environments, and it is a little more complicated than the one above. It is described here: https://basarat.gitbooks.io/typescript/content/docs/tips/build-toggles.html

  5. Debugging PostgreSQL Port 5433 and Column Does Not Exist Error

    I am creating a Django application using PostgreSQL (PSQL) for my database and was nearly finished with the API when I discovered some strange behavior. After successfully testing the API in the Django app, I decided to run some basic queries on the database. I received the following error for nearly every field in the app:

        select MaxSceneKey from game_progress_gameplaykeys;
        ERROR:  column "maxscenekey" does not exist
        LINE 1: select MaxSceneKey from game_progress_gameplaykeys;
                       ^
        HINT:  Perhaps you meant to reference the column "game_progress_gameplaykeys.MaxSceneKey".
    

    I was getting the same result for every field in the table that I tried (including when I tried to qualify the column with the table name as the hint suggests), except for ‘user_id’ and ‘objective’.

    I confirmed that the fields existed using \d+ game_progress_gameplaykeys, tried changing some of their field types, and even upgraded from Postgres 9.5 to 10.5 (I was planning to do this anyway).

    After a bunch of searching, I found the issue:

    “All identifiers (including column names) that are not double-quoted are folded to lower case in PostgreSQL.” from https://stackoverflow.com/questions/20878932/are-postgresql-column-names-case-sensitive

    I had created camelCase field names in my Django app to match the field names in the previous version of my application (written in C#).

    I decided to fix this (for now) by renaming my model fields to snake_case and using https://github.com/vbabiy/djangorestframework-camel-case to convert keys from camelCase to snake_case as they come into the API. One issue solved!
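    For simple names, the camelCase-to-snake_case conversion is roughly the one-liner below. This is only a sketch of the idea; the library above also handles request/response keys, nested data, and trickier edge cases like runs of capitals:

    ```python
    import re

    def camel_to_snake(name: str) -> str:
        # Insert an underscore before each capital letter (except a leading one),
        # then lowercase the whole string.
        return re.sub(r"(?<!^)(?=[A-Z])", "_", name).lower()

    print(camel_to_snake("MaxSceneKey"))  # max_scene_key
    print(camel_to_snake("userId"))       # user_id
    ```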

    While debugging that issue, I decided to update my laptop’s code and Postgres version since I hadn’t worked on it in a while, and I wanted to see if the issue was just on my desktop. When I reinstalled PSQL, I couldn’t seem to log in using the user I was creating. Using the postgres user was fine, though.

    I finally figured out the issue was that PSQL was running on port 5433, not 5432 (the default). After that, I was puzzling over what could be running on 5432 since ‘netstat’ and ‘lsof’ revealed nothing else running on my WSL Ubuntu VM. As I was searching around, I saw someone mention that really only PSQL should be running on that port, and I realized I had installed PSQL on Windows on that machine before I moved over to WSL. I uninstalled that, switched back to 5432 in Linux, restarted PSQL, and boom, good to go.
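    A quick way to check what is (or isn’t) answering on a given port, without netstat or lsof, is to simply attempt a TCP connection. A small stdlib-only sketch (the ports below are just the Postgres default and the one I found PSQL on; adjust as needed):

    ```python
    import socket

    def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
        """Return True if something accepts TCP connections on host:port."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    # Which port is Postgres actually listening on?
    for port in (5432, 5433):
        print(port, "open" if port_open("127.0.0.1", port) else "closed")
    ```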

    While I was debugging that issue, I learned some good information about PSQL along the way:

    /etc/postgresql/10/main/postgresql.conf allows you to set and check the port that PSQL is running on.

    /etc/postgresql/10/main/pg_hba.conf allows you to set different security protocols for connections to PSQL. Notable for local development: set the local connection lines to ‘trust’ so you don’t have to enter a password when logging in.

    Note: you need to restart the PSQL server for either of these changes to take effect. More important note: don’t use trust anywhere other than a local development install of PSQL. Ever.

    These are the lines I had to change to get that to work (may be different in versions of PSQL other than 10.5):

    # "local" is for Unix domain socket connections only
    local   all             all                                     trust
    # IPv4 local connections:
    host    all             all             127.0.0.1/32            trust
    # IPv6 local connections:
    host    all             all             ::1/128                 trust
    
  6. Configuring NGINX for localhost

    I had a little trouble finding a simple way to set NGINX up to work locally, so I wanted to write up some quick instructions here. I’m using NGINX with Windows Subsystem for Linux (WSL).

    First, I installed NGINX in WSL with ‘sudo apt-get install nginx’

    Then, I created a symlink to my frontend directory in my home directory in WSL.

    In /etc/nginx/conf.d, I created a basic config file, localhost.conf:

    server {
        listen       8080;
    
        location / {
            root   /home/username/frontend_directory;
            index  index.html index.htm;
        }
    
        # redirect server error pages to the static page /50x.html
        error_page   500 502 503 504  /50x.html;
        location = /50x.html {
            root   html;
        }
    }
    

    The only thing you should need to change is frontend_directory. Keep in mind the path may differ depending on where you keep your files. I restructured my front end after setting up webpack to include a dist/ folder, which broke this config until I updated this line again.

    To start NGINX (I have to do this every time I restart the computer): ‘sudo nginx’

    Then go to 127.0.0.1:8080 and your site should be live!

    If you need to make changes to your configuration, you should first try: ‘sudo service nginx reload’ which will give you a ‘hot’ reload instead of restarting the server.

    If you do need to restart the server, you can do so with ‘sudo service nginx restart’.

  7. Sending JSON to a server using fetch() in TypeScript

    For a new project, I wanted to use TypeScript on the front end but not any of the frameworks that usually include it (React, Angular, etc.). Unfortunately, this means that when I have been trying to figure out how to do something in TypeScript, searches often lead me to solutions involving those frameworks.

    I still haven’t found a good resource for creating a JSON object and sending it to a backend using TypeScript. The easiest solution would be to relax the TypeScript compiler and write it the same way we would in JavaScript, but that defeats the point of using TypeScript. Looking at example code, I found that creating an interface to describe the JSON object is one accepted approach.

    interface IJSON
    {
        email:string;
        fullName: string; 
        shortName: string; 
        password: string; 
        institution: string; 
        isStudent: boolean;
    }
    
    const url = 'http://127.0.0.1:8000/register/';
    
    function gatherData(e:Event)
    {
        e.preventDefault();  //don't reload page so that we can test.
    
        let json:IJSON = 
        {
            email: (<HTMLInputElement>document.getElementById("email")).value,
            fullName: (<HTMLInputElement>document.getElementById("fullName")).value,
            shortName: (<HTMLInputElement>document.getElementById("shortName")).value,
            password: (<HTMLInputElement>document.getElementById("password")).value,
            institution: (<HTMLInputElement>document.getElementById("institution")).value,
            isStudent: true,
        }
        sendDataViaFetch(json);
    }
    
    function sendDataViaFetch(json:IJSON)
    {    
        const request = new Request(url, {
            method: 'POST',
            body: JSON.stringify(json),
            headers: new Headers({
                'Content-Type': 'application/json'
                // Add an 'Authorization' header here if your API requires one.
            })
        });
    
        fetch(request)
        .then(function(response) {
            // Handle the response we get from the API
        });
    }
    
    window.addEventListener('submit', gatherData);
    

    If you have a better way, please let me know!

  8. Python Reference Talks

    While trying to get more familiar with Django, I started watching talks from DjangoCon from the last few years. I can’t seem to find the talk, but one of them had a list of great Python/Django talks, which inspired me to create my own list (with some definite overlap).

    I have found that revisiting talks like these makes me reconsider some design problems that I have recently worked through, so I want to keep a list and rewatch these periodically. I will likely add to this list in the future.

    These two go together (from DjangoCon 2015):

    - Jane Austen on PEP 8 by Lacey Williams Henschel
    - The Other Hard Problem: Lessons and Advice on Naming Things by Jacob Burch

  9. Setting up Conda, Django, and PostgreSQL in Windows Subsystem for Linux

    Because I feel much more comfortable in a terminal than on the Windows command line (or in PowerShell), I’ve really been enjoying Windows Subsystem for Linux (WSL). In fact, I use it almost exclusively for accessing the server I run this blog from. WSL is essentially a Linux VM that provides only a terminal shell in Windows (with no Linux GUI) and none of the lag you get in most VMs.

    When I created my Grocery List Flask App, I began by using WSL. However, I ran into an issue that prevented me from seeing a locally hosted version of the API in Windows, so I switched to the Windows command line for that app.

    Recently, I’ve been developing a Django application (more on that in a future post), and I ran into a similar issue. Between posting the issue with localhost on WSL and starting this new app, someone had posted a response that I had been meaning to check out. I found that for Django and PostgreSQL, making sure everything was running from localhost (or 0.0.0.0) instead of 127.0.0.x seemed to fix any issues I had. PSQL gave me some trouble just running within WSL, but I found that I just need to add ‘-h localhost’ to get it to run.

    Below are the commands I used to get Conda, Django, and PSQL all set up on my PC and then again on my laptop. This works for Django 2.0, PSQL 9.5, and Conda 4.5.9.

    Installation Instructions

    Edit: I originally had installation instructions here for PSQL 9.5. If you want 9.5 on Ubuntu, good news! You already have it. To install the newest version of PSQL, you should uninstall that version first, then install the new version from here.

    Install Conda (you need to restart your shell after installing for it to recognize ‘conda’ commands):

    #create environment
    conda create --name NameOfEnvironment
    #activate environment ('source activate' on older versions of conda)
    conda activate NameOfEnvironment
    #install Django
    conda install -c anaconda django
    #install psycopg2, to interface with PSQL
    conda install -c anaconda psycopg2
    
    #If you get permission denied, or it hangs, just rerun the install command that failed.
    #Not sure why, but that fixed things for me.
    
    #Remove PSQL from Ubuntu:
    sudo apt-get --purge remove postgresql\*
    #Then run this to make sure you didn’t miss any:
    dpkg -l | grep postgres
    
    #Install PSQL 10 using the instructions here: https://www.postgresql.org/download/linux/ubuntu/
    #I have copied them here for convenience, but please double-check that they have not changed.
    #Create the file /etc/apt/sources.list.d/pgdg.list and add the following line:
    #deb http://apt.postgresql.org/pub/repos/apt/ xenial-pgdg main
    
    #Then execute the following three commands
    wget --quiet -O - https://www.postgresql.org/media/keys/ACCC4CF8.asc | sudo apt-key add -
    sudo apt-get update
    sudo apt-get install postgresql-10
    
    sudo service postgresql start    
    sudo -i -u postgres -h localhost
    createuser --interactive
           #enter role name: local_user
           #answer y to the superuser prompt
    
    #create the database
    createdb local_db
    #log into local_db
    psql -d local_db
    
    #privileges for Django to modify tables.
    GRANT ALL PRIVILEGES ON DATABASE local_db TO local_user;
    
    ALTER USER local_user WITH PASSWORD 'password';
    
    #enter \q to quit the interactive console.
    #enter 'exit' to leave the postgres user.
    
    #one line command to log in as the user to check tables during development.
    psql -h localhost -d local_db -U local_user
    
    python manage.py makemigrations
    python manage.py migrate
    
    #Now log back in to PSQL using the line above, then enter \dt and you should see tables
    #like django_admin_log, django_content_type, django_migrations, and django_sessions.
    #Your PSQL DB is now connected to your Django app!
    
    #optional for now, but allows you to ensure db connection works by storing credentials for the superuser you create.
    python manage.py createsuperuser
    
    #command to run the server. go to localhost:8000 in your web browser to view!
    python manage.py runserver 0.0.0.0:8000
    

    I used this post for reference.
