
Update: Three more podcasts I started listening to

Last November I admitted that I had started listening to podcasts, and I shared my list of two essential and two extra urbanism podcasts. Since then I’ve added three more podcasts to my rotation.

A photo of University Center in the South Loop, behind a Green Line elevated train headed north. Chicago urbanism in one frame: high-rise student housing and a conference center, a 131-year-old elevated transit line (albeit running over the Harrison Curve track, built in 2003 to replace two 90-degree turns), and an undeveloped surface parking lot.

(Links go to Apple Podcasts.)


City Dweller

Listen to the episodes where they interview my friend Eric Allix Rogers about what he appreciates in Chicago, and my sometime co-conspirator Emily Talen (also an urban geography professor at the University of Chicago). Other Chicagoans, including Natalie Moore and Mary Wisniewski, have also been interviewed. Episodes are short!

Odd Lots

This is one of Bloomberg Media’s podcasts, with hosts Tracy Alloway and Joe Weisenthal. I most appreciate the episodes where they explain financial topics I still have a hard time understanding, and I really liked the recent episode where they interviewed Saule Omarova to talk about the FDIC and the Federal Reserve.

Tracy and Joe also interviewed Stephen Smith and Bobby Fijan to talk about apartment building designs, unit layouts, and why the double-loaded corridor is fine for hotels but a poor way to build apartments for families.

The War on Cars

This should have gone in my previous post because it’s another urbanism podcast. The title is a pretty good summary: the three hosts – Sarah Goodyear, Aaron Naparstek (who cofounded Streetsblog), and Doug Gordon – discuss the many, many ways that cars ruin cities.

A kludge to build a rental registry in Cook County 

Chicago should have a rental registry, a database of dwelling units that are rented to tenants, for at least two reasons:

  1. The city would know things about the rental units, including how much they cost, where they are, and whether any are vacant and could be occupied if only people knew they were available and how to get in touch with the owner.
  2. The city would know who the owners are and could contact them to issue citations, advise them about emergency rental assistance during pandemics and other times of necessity, or even fill out the applications for them.

Building and administering a rental registry from scratch would be very expensive – probably tens of millions of dollars to start and more than a million dollars annually.

I propose a kludge that uses existing databases and modifies existing standard operating procedures amongst a small group of Cook County and Chicago agencies. A kludge is a workaround. It has other meanings and an uncertain etymology.

An ideal rental registry helps solve at least four problems:

  1. Identifying who owns a rental home
  2. Counting the number of rental units in a building
  3. Tracking rental prices
  4. Tracking rental unit availability [see my other blog post about counting vacant units]
A 9-unit apartment building in Little Italy is undergoing renovation.

The kludge has four parts

1. Incorporate data about the number of units declared on Real Estate Transfer Tax forms (which in Cook and many other counties are transmitted to the Illinois Department of Revenue digitally).

There is already a city office that reviews or audits these forms, looking for instances where the buyer or seller incorrectly claimed certain exemptions from the RETT, since those errors cost the city revenue. That office could also verify that the number of units was entered correctly on the form.

2. For banks that hold city deposits, amend legislation to require that their newly issued or refinanced mortgages specify the number of units in the documentation they must submit. The ordinance that regulates banks holding city deposits was amended a few years ago to require that they report how many loans they issue in Chicago for both commercial and residential properties.

Databases 1 and 2 act as checks on each other.
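To make that cross-check concrete, here is a minimal SQL sketch. The table and column names are hypothetical, and I’m assuming the county’s PIN (Property Index Number) as the join key:

-- Unit counts declared on RETT forms (part 1) and unit counts
-- reported in mortgage documents (part 2), keyed by parcel PIN.
CREATE TABLE rett_units (pin text PRIMARY KEY, declared_units int);
CREATE TABLE mortgage_units (pin text PRIMARY KEY, reported_units int);

-- Flag parcels where the two sources disagree, for follow-up.
SELECT r.pin, r.declared_units, m.reported_units
FROM rett_units r
JOIN mortgage_units m USING (pin)
WHERE r.declared_units IS DISTINCT FROM m.reported_units;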

3. “Hire” the Cook County Assessor’s Office to create and operate the database for the unit count data from 1 and 2 (likely as an augmentation of their existing database).

The database would also store the commercial valuation data the CCAO obtains from third-party sources and from owners who volunteer it (Assessor Kaegi is already collecting and publicly publishing this information).

At this point, with features 1, 2, and 3, we are assembling a pretty broad but incomplete record of where rental units are. It will become more complete over time as properties transfer (sell) and the details of the transfer (sale), and of the properties themselves, are recorded.

It still doesn’t have a clue about rental prices, though.

4. The Cook County Assessor’s Office creates new property classifications. Property classifications allow for the comparison of like buildings for the purpose of establishing assessed values for all properties that are not tax exempt.

One of the most common classifications in Chicago is “2-11”, for apartment buildings with two to six units. This means that, generally, the values of the ubiquitous two-flats and three-flats get compared to each other and sometimes to four-flats, etc.

I suggest that there should be a few new property classifications, but I have only one idea so far: classify limited equity and Chicago Housing Trust properties differently. 

Bickerdike is one organization that built a lot of limited equity row houses and detached houses in the 1990s and 2000s, but I am not aware of a publicly accessible database identifying them.

These houses represent permanently affordable housing and we should have a better system to track them!

This screenshot shows part of the spreadsheet of apartments data that the Cook County Assessor’s Office collected for the 2021 tax year.

How broad is the kludge?

  • Using the Real Estate Transfer Tax data from 2022 Q1 to Q3, 3,550 buildings in Chicago, comprising 22,217 units, were transferred. (I don’t know how many were arm’s-length transactions, meaning the properties were sold to new owners.)
  • In the CCAO’s apartments data collected for the Rogers Park Township, there is semi-detailed information about 715 buildings that have seven or more apartments, comprising 18,541 units. Details include the unit size breakdown by bedroom count.

Chicago has 556,099 rented dwelling units in buildings with two or more units (according to the ACS 2021 1-year estimate). In my limited analysis I’ve already found data about 40,758 of them – about 7.3 percent – and that’s only for part of the city [1].

Notes, limitations, and updates

[1] There may also be duplicates between the buildings in the RETT database and the CCAO apartments dataset.

These databases would not have information about detached (“single family”) houses, single-unit semi-detached homes (rowhouses and townhouses), or condos used as rentals. This severely limits the coverage. As it stands, Chicago Cityscape has unit count information for about 37 percent of multi-family (apartment) buildings.

5th Ward Alderperson Desmond Yancy proposed an ordinance that would establish a rental registry (O2023-0004085). The rationale is shown in the screenshot below. (Go directly to the ordinance’s PDF.)

Screenshot of the proposed rental registry benefits.

Determining dwelling unit count at an address using 1950 Census records

It’s possible to use 1950 U.S. Census records to establish the number of historic dwelling units in a building in Chicago for the purposes of the city generating a “zoning certificate”. That certificate is a recognition of the number of legal dwelling units, which is especially useful when that number is greater than the current zoning code allows. Zoning certificates are required to be generated (i.e. requested from the city’s planning department) during the sale of a small-scale residential property.

This Chicago zoning certificate was generated the “normal” way (which I don’t know how to describe), while this blog post is about digging up evidence in case the normal way doesn’t affirm the number of dwelling units you want it to affirm.

However, based on a recent experience of a client of mine, the zoning certificate – while supposedly valid for one year – is subject to dispute later! Finding Census records showing the same or more dwelling units in a given building has helped re-establish the validity of the number of units stated in a zoning certificate.

It will take you some time to research Census records! I would budget at least one hour. There is a painstaking process to find the webpage that has the enumeration (counting) sheets for the address you want, and it will take some time to sift through those sheets (you may have to look at multiple pages for the same address).

Here are the steps involved:

  1. Find the enumeration district (ED).
  2. Review that ED’s set of sheets in the National Archives 1950 Census website.
  3. Page through and read every sheet until the address is found.

Tip: I recommend doing this, especially step 3, on a computer with the largest screen possible as it’s easier to view the scanned Census sheets that way.

1. Find the enumeration district

The New York Public Library has a great tutorial on their blog, and I recommend you start by reading section #2, “Generate an ED number”.

Once you locate the ED (following the instructions in “Generate an ED number”), I would say stop reading the NYPL blog post.

Alternatively, for Chicago address lookups, you can browse these maps of the enumeration districts and then search for those ED numbers directly on the National Archives website.

2. Review that ED’s set of sheets

The instructions will advise you to use Steve Morse’s third-party Unified Census ED Finder (which is the most useful part of this process). The ED Finder has a wizard that asks you to select the state, county, city, street name, and cross street. It will then produce a set of one or more ED numbers under the heading “1950 ED numbers corresponding to your location”.

A screenshot of the ED Finder “wizard”.

Tip: ED Finder will likely show you multiple ED numbers; you may need to look at all of them to find an address. I think this happens because the link between each enumeration district and a city’s streets is imprecise.

Clicking each ED number opens a new tab on the ED Finder website with links to three different image viewers – databases where scans of microfilm are shown online. One of them, NARA (the National Archives & Records Administration), is public; I recommend that one because it’s free and doesn’t require an account.

3. Read each enumeration sheet

Once you arrive on the NARA website, click “Population Schedules” to reveal the enumeration sheets. Start paging and reading!

Tip: Street names are written vertically on the left edge of the page.

A screenshot of the 1950 Census database and website of NARA. It’s necessary to click the “population schedules” button to display the enumeration sheets.

The NARA viewer isn’t the best – it doesn’t allow full-screen viewing, and I was constantly zooming in to read the street names and house numbers.

A screenshot of a 1950 Census enumeration sheet with two annotations.

Creating a PostgreSQL PostGIS function to get around a DataTables Editor limitation

DataTables is a fantastic piece of software that turns any HTML table into an interactive and editable one.

Chicago Cityscape uses DataTables on pretty much every page. DataTables also provides server-side processing for PHP to grab the right data from a database and show it to the user, 10 records at a time (the user can change how many records are shown at once).
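As an illustration only (this is not DataTables’ actual generated SQL), server-side processing boils down to paged queries like the following, with hypothetical table and column names:

-- Fetch one "page" of 10 records; OFFSET advances as the user pages.
SELECT id, name
FROM places
ORDER BY name
LIMIT 10 OFFSET 20;  -- records 21 through 30, i.e. page 3 at 10 per page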

Screenshot showing my new function, using the function, and the results.

Problem

One of the problems I’ve had to get around is that the DataTables Editor script recognizes a SELECT statement with only one function per field. If a second function is manipulating a field in the SELECT statement, the ALIAS won’t work and the JavaScript will not be able to read and show the data properly.

I almost always use the two functions ST_AsGeoJSON and ST_Transform on my geographic data to (1) transform it from SRID 3435 to 4326 for display on web mercator maps, and (2) convert the GIS data into GeoJSON strings for easy transfer to the Leaflet JavaScript library.
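For illustration, this is the shape of query that trips up the Editor script; the table name boundaries is a hypothetical stand-in:

-- Two nested functions on one field: the "geom" alias no longer
-- round-trips through the Editor's generated JavaScript correctly.
SELECT id, ST_AsGeoJSON(ST_Transform(geom, 4326), 5) AS geom
FROM boundaries;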

This is a known issue – see this thread on the DataTables forums – that Allan, the DataTables creator, has acknowledged and provided an alternative solution for.

Solution

It turns out that it’s easy to write a function that combines both functions.

CREATE OR REPLACE FUNCTION ST_TAGJP (field geometry, srid int = 4326, simplify int = 5)
  RETURNS text
  LANGUAGE SQL
  IMMUTABLE
-- reproject the geometry, then serialize it to GeoJSON with "simplify" decimal places
RETURN ST_AsGeoJSON(ST_Transform(field, srid), simplify);

The code above defines a function I call ST_TAGJP that combines the two functions I already described. It’s flexible: the arguments let you specify the geometry field, the SRID to transform (reproject) to, and the simplify value passed to ST_AsGeoJSON, which sets how many decimal places to use in the coordinates.
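Using it looks like this (the table and column names are again just for illustration):

SELECT name, ST_TAGJP(geom) AS geom
FROM boundaries;

Because the SELECT statement now has a single function per field, the Editor’s alias handling works again.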

Avoid “VARCHAR” columns in PostgreSQL [PGSQL Phriday #006]

This is a response to Grant.

This is one thing I wish I knew while learning PostgreSQL: don’t ever use the VARCHAR column type. I have never encountered a situation where it would have been superior to TEXT.

The problem with the VARCHAR column type is that it can carry a length limit that rejects text longer than a certain number of characters. This maximum character length can be introduced by the person who creates the table and adds the column.

The maximum character length can also be introduced by a GIS program (specifically, QGIS and the ogr2ogr command line tool), as well as by some online tools that convert spreadsheets to SQL and even write the table definition and “CREATE TABLE” statement. In both cases, the program tries to guess the column type and seems to default to VARCHAR(n) instead of TEXT.

The maximum character length is a problem because trying to insert more text than allowed into that field will cause an error, and the INSERT or UPDATE statement will fail. I think it’s also a bad idea to create these columns because it makes it expensive to change your mind later about what length of text will go into that field.
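A quick demonstration with a throwaway table:

CREATE TABLE demo (name varchar(10));

INSERT INTO demo VALUES ('more than ten characters');
-- ERROR:  value too long for type character varying(10)

-- The escape hatch once you hit the limit:
ALTER TABLE demo ALTER COLUMN name TYPE text;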

Other things I wish I knew earlier

  • Using transactions to test your query and its effect on your data so that you can easily undo any changes (this and the next two items are sketched below).
  • How and when to use indexes beyond the “btree” index type (for example, “gin” for array and JSONB columns).
  • Add “time_created_at” and “time_updated_at” columns to basically every dataset. This makes it easy to sort things later; when creating a new dataset it’s not always obvious how you might want to display or manipulate the data until days, weeks, or months afterward. Sorting data by “when was the information changed last” is a common use case for me.
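Hedged sketches of all three, using a hypothetical listings table:

-- 3. Timestamp columns on basically every table (an UPDATE trigger
-- would be needed to keep time_updated_at current after changes).
CREATE TABLE listings (
    id bigint GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
    attrs jsonb,
    price numeric,
    time_created_at timestamptz DEFAULT now(),
    time_updated_at timestamptz DEFAULT now()
);

-- 2. A GIN index to speed up containment queries on the JSONB column.
CREATE INDEX listings_attrs_idx ON listings USING gin (attrs);

-- 1. Test a change inside a transaction so it's easy to undo.
BEGIN;
UPDATE listings SET price = price * 1.05;
-- inspect the results, then:
ROLLBACK;  -- or COMMIT once you're satisfied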