Author: Chris Lattner
To the members of the MIT community:
We are writing to inform you of plans to upgrade the MIT campus network, and in particular to upgrade MIT to the next generation of Internet addressing. (Please note that no action is required on your part.)
Machines on the Internet are identified by addresses. The current addressing scheme, called IPv4, was specified around 1980, and allowed for about 4 billion addresses. That seemed enough at the time, which was before local area networks, personal computers and the like, but the Internet research community recognized around 1990 that this supply of addresses was inadequate, and put in place a plan to replace the IPv4 addresses with a new address format, called IPv6. IPv6 uses a 128-bit address scheme and is capable of 340 undecillion addresses (340 times 10^36, or 340 trillion trillion trillion possible IP addresses). This stock of addresses allows great flexibility in how addresses are assigned to hosts, for example allowing every host to use a range of addresses to
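For a sense of scale, the 340 undecillion figure is simply 2^128; a quick check, for example in Python:

total = 2 ** 128             # a 128-bit address space
print(total)                 # 340282366920938463463374607431768211456
print(format(total, ".2e"))  # 3.40e+38, i.e. roughly 340 * 10**36 (340 undecillion)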
image: docker.mydomain.com/build/kube-go-make

variables:
  DOCKER_TAG: docker.mydomain.com/myapp/home:$CI_COMMIT_REF_SLUG
  DOCKER_HOST: tcp://localhost:2375
  DOCKER_DRIVER: overlay
  PROD_RSYNC_HOST: myprodserver.com
  DOMAIN: mydomain.com
  CHART_DIR: chart
console.log(1);                                          // plain call
(_ => console.log(2))();                                 // immediately-invoked arrow function
eval('console.log(3);');                                 // via eval
console.log.call(null, 4);                               // Function.prototype.call
console.log.apply(null, [5]);                            // Function.prototype.apply
new Function('console.log(6)')();                        // via the Function constructor
Reflect.apply(console.log, null, [7]);                   // Reflect.apply
Reflect.construct(function(){ console.log(8); }, []);    // invoked as a constructor via Reflect.construct
Function.prototype.apply.call(console.log, null, [9]);   // apply borrowed through call
Function.prototype.call.call(console.log, null, 10);     // call borrowed through call
I screwed up using git ("git checkout --" on the wrong file) and managed to delete the code I had just written... but it was still running in a process in a docker container. Here's how I got it back, using https://pypi.python.org/pypi/pyrasite/ and https://pypi.python.org/pypi/uncompyle6
apt-get update && apt-get install gdb
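The rest of the recovery, roughly: install pyrasite and uncompyle6 with pip (gdb is what pyrasite uses to inject into the running interpreter), attach to the still-running process with pyrasite-shell <pid>, and decompile the in-memory code objects back into source. A sketch of what you'd run inside the injected shell, with placeholder module and function names, and noting that the exact uncompyle6 entry point varies between versions:

# Run inside the pyrasite-shell attached to the target process: the shell
# executes in that interpreter, so the lost code objects are still importable.
import sys
import uncompyle6
from my_app import my_lost_function   # placeholder names for the deleted code

# Decompile the live code object back into source. deparse_code is the
# older uncompyle6 2.x API; newer releases expose deparse_code2str instead.
uncompyle6.deparse_code(2.7, my_lost_function.__code__, sys.stdout)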
import sqlite3
import tldextract

history_domains = set()
cf_domains = None

print("Loading domains from Chrome browsing history...")
# Copy history from ~/Library/Application Support/Google/Chrome/Default/History
conn = sqlite3.connect('History')
# Chrome keeps every visited URL in the `urls` table of its History database
for (url,) in conn.execute('SELECT url FROM urls'):
    ext = tldextract.extract(url)
    history_domains.add(ext.registered_domain)
conn.close()
Disclaimer: This piece is written anonymously. The names of a few particular companies are mentioned, but as common examples only.
This is a short write-up on things that I wish I'd known and considered before joining a private company (aka startup, aka unicorn in some cases). I'm not trying to make the case that you should never join a private company, but rather that the power imbalance between founder and employee is extreme, and that potential candidates would do well to consider it before signing on.
I recently received a job offer from Amazon AWS. The pay and benefits are quite decent. I do really like the role and it would be work that interests me. The people that I have met so far in interviews are great.
What I do not like are the documents I have to sign to get started. They include stuff like this:
"ATTENTION AND EFFORT. During employment, Employee will devote Employee’s entire productive time, ability, attention, and effort to
Integrating third-party JavaScript libraries not written with Google Closure Compiler in mind continues to be both a source of errors for users going to production and a significant vigilance-and-effort burden for the broader community (CLJSJS libraries must provide up-to-date and accurate externs).
In truth, writing externs is far simpler than most users imagine. You only need externs for the parts of the library you actually intend to use from ClojureScript. However, this isn't so easy to determine from Closure's own documentation. Still, in the process of writing your code it's easy to miss a case. In production you will then see the much-dreaded error that some mangled name does not exist. Fortunately, it's possible to enable the compiler flags :pretty-print true and :pseudo-names true to generate an advanced build with human-readable names. However, debugging missing externs this way means recompiling your production build for each missed case. So much time wasted for such simple mistakes damages our sen