From e01780962c3a96e931a06123e47e621e5d7f6bc1 Mon Sep 17 00:00:00 2001
From: FaresSalem
Date: Fri, 27 Mar 2020 04:04:55 +0200
Subject: [PATCH 1/4] Adding Hardware Security as a new sub-research area (#596)

* Create README.md

* Rename security/HW Security/README.md to security/hardware_security/README.md

* Rearranging files

* Rearranging files

* Delete sok-eternal-war-in-memory.pdf

* Moving sok-eternal-war-in-memory.pdf

* Fix dead link

Updating the link for "Internet Census via Insecure Routers"

* Add Hardware Security subsection
---
 security/README.md | 16 +++++++++++-----
 1 file changed, 11 insertions(+), 5 deletions(-)

diff --git a/security/README.md b/security/README.md
index 0ad82bb..cf9fec4 100644
--- a/security/README.md
+++ b/security/README.md
@@ -1,17 +1,23 @@
-## Security
-
+ Security
+===========
 * [Reflections on Trusting Trust (1984)](http://www.ece.cmu.edu/~ganger/712.fall02/papers/p761-thompson.pdf)
-* [Internet Census via Insecure Routers (2012)](https://internetcensus2012.bitbucket.io/paper.html)
+* [Internet Census via Insecure Routers (2012)](https://www.researchgate.net/publication/279069631_The_Internet_Census_2012_Dataset_An_Ethical_Examination)
 * [Looking inside the (Drop) Box (2013)](https://www.usenix.org/system/files/conference/woot13/woot13-kholia.pdf)
 * [Making Programs Forget: Enforcing Lifetime For Sensitive Data (2011)](https://www.usenix.org/events/hotos11/tech/final_files/Kannan.pdf)
 * [Breach: Reviving The Crime Attack (2013)](http://breachattack.com/resources/BREACH%20-%20SSL,%20gone%20in%2030%20seconds.pdf)
 * [Why Silent Updates Boost Security (2009)](http://www.techzoom.net/Papers/Browser_Silent_Updates_%282009%29.pdf)
 * [A survey of coordinated attacks and collaborative intrusion detection (2010)](https://www.tk.informatik.tu-darmstadt.de/fileadmin/user_upload/Group_TK/zhou2010survey.pdf)
-* [Meltdown (2018)](https://meltdownattack.com/meltdown.pdf)
-* [Spectre Attacks: Exploiting Speculative Execution (2018)](https://spectreattack.com/spectre.pdf)
 * :scroll: [Macaroons: Cookies with Contextual Caveats for Decentralized Authorization in the Cloud (2014)](macaroons-cookies-with-contextual-caveats.pdf)
 * :scroll: [Insertion, Evasion, and Denial of Service: eluding network intrusion detection (1998)](ids-evasion-ptacek-newsham.pdf)
+
+## Hardware Security
+
+* [Meltdown (2018)](https://meltdownattack.com/meltdown.pdf)
+* [Spectre Attacks: Exploiting Speculative Execution (2018)](https://spectreattack.com/spectre.pdf)
+* [DRAM Row Hammer (2014)](https://people.inf.ethz.ch/omutlu/pub/dram-row-hammer_isca14.pdf)
+  - Flipping Bits in Memory Without Accessing Them: An Experimental Study of DRAM Disturbance Errors
+
 * :scroll: [SoK: Eternal War in Memory (2013)](sok-eternal-war-in-memory.pdf)
   - Classifies memory attacks into a taxonomy that is usable by both black- and white-hats.
   - An excellent primer on the different memory-related vulnerabilities that exist, (more importantly) why they exist, and the ways in which various defences act to counter them.
 
From 276ecb8644ca5b6aa438bf1c40cb293741c8cb98 Mon Sep 17 00:00:00 2001
From: Sean Broderick
Date: Sat, 28 Mar 2020 00:25:52 -0400
Subject: [PATCH 2/4] fix link in machine_learning (Top 10 algorithms in data mining)

---
 machine_learning/README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/machine_learning/README.md b/machine_learning/README.md
index c3b94b1..10f6140 100644
--- a/machine_learning/README.md
+++ b/machine_learning/README.md
@@ -3,7 +3,7 @@
 ## External Papers
 
-* [Top 10 algorithms in data mining](http://www.cs.uvm.edu/~icdm/algorithms/10Algorithms-08.pdf)
+* [Top 10 algorithms in data mining](https://www.researchgate.net/publication/29467751_Top_10_algorithms_in_data_mining)
 
   While it is difficult to identify the top 10, this paper contains 10 very important data mining/machine learning algorithms
 

From c1debdd00dab081559772627a8825ae90802fd73 Mon Sep 17 00:00:00 2001
From: Ane Berasategi
Date: Sat, 28 Mar 2020 05:37:04 +0100
Subject: [PATCH 3/4] Deleted discontinued chapters (#598)

The links don't exist anymore
---
 README.md | 11 -----------
 1 file changed, 11 deletions(-)

diff --git a/README.md b/README.md
index f50c982..94b000e 100644
--- a/README.md
+++ b/README.md
@@ -23,19 +23,15 @@ Here are our official chapters. Let us know if you are interested in [starting o
 * [Bhubaneswar](https://www.facebook.com/groups/pwlbbsr/)
 * [Boston](http://www.meetup.com/Papers-We-Love-Boston-Cambridge/)
 * [Brasilia](http://www.meetup.com/papers-we-love-bsb)
-* [Boulder](http://www.meetup.com/Papers-We-Love-Boulder/)
 * [Bucharest](http://www.meetup.com/papers-we-love-bucharest/)
 * [Buenos Aires](https://paperswelove.org/buenos-aires/)
 * [Cairo](http://www.meetup.com/Papers-We-Love-Cairo/)
 * [Chattanooga](http://www.meetup.com/Papers-We-Love-Chattanooga/)
 * [Chicago](http://www.meetup.com/papers-we-love-chicago/)
 * [Columbus, Ohio](http://www.meetup.com/Papers-We-Love-Columbus/)
-* [Dallas](http://www.papersdallas.com/)
 * [Gothenburg](https://www.meetup.com/Papers-We-Love-Gothenburg/)
-* [Guadalajara](https://www.facebook.com/pwlgdl/)
 * [Hamburg](http://www.meetup.com/Papers-We-Love-Hamburg/)
 * [Hyderabad](http://www.meetup.com/papers-we-love-hyderabad/)
-* [Iasi](http://www.meetup.com/Papers-We-Love-Iasi/)
 * [Iowa City](https://www.meetup.com/techcorridorio)
 * [Kathmandu](https://www.facebook.com/groups/PapersWeLoveKathmandu/)
 * [Kyiv](https://www.facebook.com/groups/PapersWeLoveKyiv)
@@ -43,18 +39,11 @@ Here are our official chapters. Let us know if you are interested in [starting o
 * [London](http://www.meetup.com/papers-we-love-london)
 * [Los Angeles](http://www.meetup.com/papers-we-love-la)
 * [Madrid](http://www.meetup.com/Papers-We-Love-Madrid/)
-* [Medellín](https://www.meetup.com/paperswelovemde/)
 * [Montreal](http://www.meetup.com/Papers-We-Love-Montreal/)
-* [Mumbai](https://www.meetup.com/Papers-We-Love-Mumbai/)
-* [Munich](http://www.meetup.com/Papers-We-Love-Munich/)
 * [New York City](http://www.meetup.com/papers-we-love/)
 * [Paris](http://www.meetup.com/Papers-We-Love-Paris/)
-* [Philadelphia](http://www.meetup.com/Papers-We-Love-Philadelphia/)
-* [Portland](http://www.meetup.com/Papers-We-Love-PDX/)
-* [Porto](https://www.meetup.com/Papers-We-Love-Porto)
 * [Pune](http://www.meetup.com/Doo-Things)
 * [Raleigh-Durham](https://www.meetup.com/Papers-We-Love-Raleigh-Durham/)
-* [Reykjavík](http://www.meetup.com/Papers-We-Love-Reykjavik)
 * [Rio de Janeiro](https://www.meetup.com/pt-BR/papers-we-love-rio-de-janeiro/)
 * [San Diego](http://www.meetup.com/Papers-We-Love-San-Diego/)
 * [San Francisco](http://www.meetup.com/papers-we-love-too/)

From d8c4b140a24f00d0491921f1d4bf98cf6c7fa720 Mon Sep 17 00:00:00 2001
From: christoshadjiaslanis
Date: Sat, 28 Mar 2020 04:45:44 +0000
Subject: [PATCH 4/4] Added script to download all (pdf) papers locally (#597)

* Added script to download all PDFs from the Readmes

* Removed sleep

* Formatting

* Added guard clauses and some docs to download script. Added it to scripts folder. Added download script readme. Added section in root readme.

* Removed old download_all.sh

* Added support for specifying which directories you want to download.

* Removed dependency on xargs.

* Changed filename to download.sh. Updated READMEs.

* More README

* Fixed download.sh logic for multiple arguments. Removed Readme section about executing script from anywhere. Updated the parsing of URLs to be more specific.
---
 README.md           | 12 ++++++++++++
 scripts/README.md   | 22 ++++++++++++++++++++++
 scripts/download.sh | 46 +++++++++++++++++++++++++++++++++++++++++++++
 3 files changed, 80 insertions(+)
 create mode 100644 scripts/README.md
 create mode 100755 scripts/download.sh

diff --git a/README.md b/README.md
index 94b000e..91e9310 100644
--- a/README.md
+++ b/README.md
@@ -108,6 +108,18 @@ Reading a paper is not the same as reading a blogpost or a novel. Here are a few
 * Love a Paper - [@loveapaper](https://twitter.com/loveapaper)
 
+### Download papers
+
+Open your favourite terminal and run:
+
+```bash
+$ ./scripts/download.sh
+```
+
+This will scrape markdown files for links to PDFs and download papers to their respective directories.
+
+See [README.md](./scripts/README.md) for more options.
+
 ## Contributing Guidelines
 
 Please take a look at our [CONTRIBUTING.md](https://github.com/papers-we-love/papers-we-love/blob/master/.github/CONTRIBUTING.md) file.

diff --git a/scripts/README.md b/scripts/README.md
new file mode 100644
index 0000000..bb4e7e5
--- /dev/null
+++ b/scripts/README.md
@@ -0,0 +1,22 @@
+# Scripts
+
+Scripts for working with repository content.
+
+## Download Utility
+A convenience script to download papers. This will scrape the README.md files for links to PDFs and download the papers into their respective directories.
+
+The download utility is idempotent and can be run multiple times safely.
+
+### Usage
+Open your favourite terminal and run:
+
+```bash
+$ ./scripts/download.sh
+```
+
+
+Optionally, to download specific topics, specify their directories as arguments:
+
+```bash
+$ ./scripts/download.sh android concurrency
+```

diff --git a/scripts/download.sh b/scripts/download.sh
new file mode 100755
index 0000000..d5139d4
--- /dev/null
+++ b/scripts/download.sh
@@ -0,0 +1,46 @@
+#!/bin/bash
+
+# Guard clauses: check that the required binaries are installed.
+which wget > /dev/null || { echo "Error: wget not installed." ; exit 1 ; }
+which egrep > /dev/null || { echo "Error: egrep not installed." ; exit 1 ; }
+
+# Recursively traverse directories in the repo, scraping markdown files for
+# URLs that point at PDFs. Downloads the PDFs into their respective directories.
+download_for_directory() {
+    cd "$1" || { echo "Error: directory not found." ; exit 1 ; }
+
+    for f in *; do
+        if [[ -d ${f} ]]; then
+            download_for_directory "${f}" &
+        fi
+    done
+
+    # Scrape URLs from markdown files: http(s) links ending in .pdf, with the trailing ")" of the markdown link syntax stripped.
+    urls=$(cat *.md 2> /dev/null | egrep -o 'https?://[^ ]+' | grep '\.pdf' | tr -d ')')
+
+    # Word splitting on ${urls} is intentional: one URL per iteration.
+    for url in ${urls}; do
+        # Ignore empty URLs
+        if [[ ! -z ${url} ]]; then
+            wget "${url}" --no-clobber --quiet --timeout=5 --tries=2
+        fi
+    done
+
+    cd ..
+    echo "$1 done."
+}
+
+# If no directories are supplied, iterate over the entire repo.
+if [[ "$#" -eq 0 ]]; then
+    REPO_ROOT_DIR="$(dirname "$0")/.."
+    download_for_directory "${REPO_ROOT_DIR}"
+else
+    # Iterate over the specified directories.
+    for dir in "$@"
+    do
+        download_for_directory "${dir}"
+    done
+fi
+
+# Wait for backgrounded child downloads to terminate.
+wait
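
A note on the scraping pipeline in download.sh: it can be exercised by hand before running the full script. The sketch below mirrors the `urls=$(...)` line of the patch; the input file `security/README.md` is only an illustrative assumption, and `grep -E` stands in for the `egrep` binary the script checks for:

```bash
#!/bin/bash
# Mirror the URL-scraping pipeline from scripts/download.sh on a single file:
# extract http(s) URLs, keep only those containing ".pdf", and strip the
# trailing ")" left over from the markdown link syntax.
grep -Eo 'https?://[^ ]+' security/README.md | grep '\.pdf' | tr -d ')'
```

Each URL this prints is one the script would hand to `wget --no-clobber`; since `--no-clobber` skips files that already exist on disk, repeated runs of the utility stay idempotent, as the scripts/README.md above claims.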