
I think bringing attention to common practices putting people in jeopardy is a good thing, because it’s likely website operators and the general public don’t think about these risks. If data is collected, it can be used by police and prosecutors. Ignoring the problem because everyone knows about analytics does not start necessary conversations. Google has increasingly been a target for law enforcement investigations, and the search warrants Google receives can be as broad as compelling Google to turn over data on the accounts or IP addresses of anyone who Googled someone’s name.[1]

[1] https://www.techdirt.com/2017/03/17/judge-grants-search-warr...


Shouldn't this be a discussion about how law enforcement search powers are getting out of control, then?


That’s another worthwhile topic, but since the privacy threat model of companies selling some types of birth control will be unusual for the foreseeable future, it is also worth talking about in isolation.


It also isn’t just limited to Google; I wonder how these companies protect their users from payment processor data collection…


It is, but if you're a website operator, it's much easier to not use Google Analytics than it is to convince your government to clamp down on law enforcement search powers.


Turning this into a dichotomy between website operators needing to crusade against government overreach and not using GA has got to be the worst-faith argument I've seen in a minute.

The entire premise of this article takes advantage of people who aren't technically inclined and don't know enough to realize that reverse engineering GA fingerprints is not even close to how the government would de-anonymize visitors.

If your local government decides to overreach and find out who's getting abortion pills, Google Analytics will not even be your 1000th biggest problem. ISPs will readily share which sites people visit, the stores themselves will get leaned on, and your mail carrier will share where you get packages from.

I mean did you even read the article? "Abortion Ease, BestAbortionPill.com, PrivacyPillRX, PillsOnlineRX, Secure Abortion Pills, AbortionRx, Generic Abortion Pills, Abortion Privacy and Online Abortion Pill Rx."

What do you think HTTPS is going to do when one of these URLs shows up in your traffic, email inbox, and a reverse mail address lookup leads to one of these?


> Turning this into a dichotomy between website operators needing to crusade against government overreach and not using GA has got to be the worst-faith argument I've seen in a minute.

Putting words in my mouth has got to be the worst faith argument I've seen in a minute.

> If your local government decides to overreach and find out who's getting abortion pills, Google Analytics will not even be your 1000th biggest problem. ISPs will readily share which sites people visit, the stores themselves will get leaned on, and your mail carrier will share where you get packages from.

And it's much easier for website operators to stop using Google Analytics than it is for them to educate their customers on using VPNs and mailing their packages to dead drops (let alone for their customers to actually do those things).

This ain't about dichotomies or the lack thereof. This is about what people can do now to mitigate low-hanging fruit. Nobody said the solution is only to stop using GA; literally all that was actually said is that not using GA is something that the operators of such stores can trivially and immediately do.

That is:

> What do you think HTTPS is going to do when one of these URLs shows up in your traffic, email inbox, and a reverse mail address lookup leads to one of these?

The existence of problems outside your control does not erase the existence of problems entirely within your control. Website operators cannot force you to use a VPN or a secure email or an anonymous address. They can minimize the data of yours they're sending to third parties.

> I mean did you even read the article?

I mean did you even read the HN guideline specifically prohibiting such a question?


> Putting words in my mouth has got to be the worst faith argument I've seen in a minute.

"it's much easier to not use Google Analytics than it is to convince your government to clamp down on law enforcement search powers."

You literally wrote a one-sentence comment that compares a decision for website owners to choose between not using GA and convincing their government to clamp down on law enforcement search powers. I didn't put a single word in your mouth in stating what I did.

You wanting to walk back what you wrote doesn't make my comment put words in your mouth, they were the words you said.

> And it's much easier for website operators to stop using Google Analytics than it is for them to educate their customers on using VPNs and mailing their packages to dead drops (let alone for their customers to actually do those things).

GA is completely orthogonal to the entire discussion. That's the point that ProPublica (intentionally?) ignores, and apparently you're just unaware of.

It's doing literally nothing to protect or harm these people, because the government is not stooping to deanonymizing GA data when there are 1001 more direct and better-established methods to achieve the same thing. It's a frankly absurd point.

Saying by not using GA they're even tangentially protecting customers shows a complete lack of understanding of why this government overreach is such a problem.

-

tl;dr/too technical;didn't understand: The government doesn't need your GA fingerprint. They can get the things the fingerprint is made of, and then some, straight from Google... and your ISP... and the site's host... and the mail services... and the list goes on.

They can literally ask for all people who searched for a given term in a 5 block radius and you're trying to talk about hashed fingerprints???

It's like worrying that the government is going to check for your DNA in the toilet at a local restaurant when the establishment can be compelled to give them a receipt with your card details, your bank will give them the transaction details, your search history with the restaurant name is up for grabs, the municipal security cameras that watched you drive up are up for grabs...

-

Once you understand that, the pointlessness of this line of reasoning becomes clear, and why ProPublica is doing this becomes more questionable.

By inventing some totally ludicrous wrong, they're painting themselves as having uncovered some unique, in-depth aspect of the dynamic between these abortion pill sites and their users, but to do so they're painting the sites as negligent with the most inane stretch of logic possible.

Instead of focusing more on the actual problem, the overreach, they create a new boogeyman because it gives their reporting a unique angle. But of course that boogeyman is serving the interests of people who are having their rights stomped on.

By completely misunderstanding the topic (and in ProPublica's case I'm not buying it was unintentional) both you and the article are just throwing FUD into the actual conversation that matters.

It's annoying to see supposedly credible publications intentionally muddy things for their own benefit, and it's even more annoying to see people with a poor grasp of the situation just run with it blindly without taking 5 seconds to apply critical thinking and context to it all.


> You literally wrote a one-sentence comment that compares a decision for website owners to choose between not using GA and convincing their government to clamp down on law enforcement search powers.

Right, and note what I didn't say:

- Whether or not those things are a dichotomy or otherwise mutually exclusive

- Whether or not those things encompass the complete set of privacy violations or the mitigations thereof

Your assumption that I've made or even implied answers to either of those within the words "it's much easier to not use Google Analytics than it is to convince your government to clamp down on law enforcement search powers" - and then arguing against that assumption - is where you're putting words in my mouth. It's also what makes you accusing me of bad-faith argumentation or a lack of critical thinking or context hilariously ironic.

There's nothing for me to "walk back". You blew up at me for merely suggesting that businesses which should be valuing their customers' privacy can take very easy steps to actually signal that. Whether or not they're siphoning a bunch of data to third parties is a rather strong signal of whether or not they take their customers' privacy seriously, and the fact that you are not only incapable of understanding that concept but feel compelled to resort to unwarranted hostility and personal attacks in response to it speaks volumes.

> GA is completely orthogonal to the entire discussion.

GA is literally the context of the discussion. Just because there are other ways for others to violate your customers' privacy doesn't mean it's okay to willingly and deliberately violate your customers' privacy yourself. That you not only fail to understand this but are needlessly hostile to those who do understand this speaks volumes.

If you'd like to have an actually intelligent and civil discussion instead of angrily flinging insults at me, I'd be happy to oblige. Until then, have a nice day - hopefully better than whatever tragedy you're choosing to take out on me.


When you started this comment by saying "Yeah I created a division between those two options but that's not a dichotomy!" I questioned whether I should bother reading.

By the time I got to the invention of "angrily flung insults" and the projection about bad days, I had my answer...


Every company in healthcare (and many other industries where privacy is important) needs to know that they should not be giving customer data to third parties.


The state controlling reproduction is the alarming bit for me, closely followed by the surveillance creepiness Google enables.


Law enforcement searches aren't the only risk here.


After a Minnesota lawyer reported his neighbor for allegedly sexually assaulting his son, that neighbor cracked the lawyer's wi-fi WEP encryption and proceeded to attempt to frame him for CSAM crimes, sexual harassment, and threatening of politicians.[1] The lawyer's employer hired an outside firm to investigate, the Secret Service showed up, and ultimately a search warrant at the neighbor's home found evidence that he was the true culprit. He was given 18 years in prison.

[1] https://www.wired.com/2011/07/hacking-neighbor-from-hell/


From earlier this year "Wife Framed Husband By Planting Child Sexual Abuse Images On Phone: Police" (https://www.msn.com/en-us/news/crime/wife-framed-husband-by-...)


Jeez, imagine if the guy didn't have the support of his employer.


My house was blurred many years ago and I wish it wasn't. It almost draws more attention when you look at Street View, which is probably contrary to privacy interests. I've also had odd reactions from companies I've called out to do maintenance or yard work. Some of them never show up, and I suspect it is at least in part because they draw some negative inference from the blurred house. I'm also thinking about selling the house soon, and I just know it's going to be problematic. I've tried to get it unblurred several times, but have failed to get a response from Google.


>My house was blurred [...] It almost draws more attention [...], which is probably contrary to privacy interests.

You're commenting on privacy when you're addressing covertness.

Privacy/secrecy qualities address observability.

Secrecy: information desired to never be known (eg Location of your savings)

Privacy: information desired to be selectively known (eg a medical issue)

Covertness: information desired to be unsurprising (eg the thickness of your wallet being a side channel to your wealth)

Covertness is context-dependent, and in some contexts a highly covert entity can actually be less private than a highly private entity. Walking around town in a ScrambleSuit will make you more private but less covert. Getting the same haircut, clothes, and gait as the people in your town will make you more covert but less private.


I came here just to write about that. I don't have mine blurred, but I thought about it, and figured it'd draw more attention if it were. When 50% are blurred, maybe...


Ah, the Streisand effect.


For travel, I'm happy with the Peak Design 45L Travel Backpack. It's optimized for photographers, with separate packing cubes that clip into the bag. https://www.peakdesign.com/products/travel-backpack

Their Tech Pouch in particular is an amazing design, and has organized all of my random cables and adapters. https://www.peakdesign.com/products/tech-pouch/

For just around town, I like Timbuk2 messenger bags. Sadly, they stopped making the Especial messenger bag, which was the best product they've ever made.


I'm excited about this (and all things Sketch-related), but I thought this was a product release from the makers of Sketch until I saw the disclaimer at the bottom.

$39 seems fair, though I'd rather see this as an open-source project that the community could contribute back to. According to the changelog, this is a v1 release and the website contains no statements about the $39 buying future updates for new features or updated versions of Sketch, which is disappointing.

Aside from a few screenshots, it would be nice to have a YouTube video demonstrating how it works in practice, in real-time, especially considering the terms stating: "As a customer you are responsible for understanding this upon purchasing any item from Sketch Design System".

I wouldn't buy it without more information, and knowing who is behind it. There's no "about" section on the website with a real person identified. The domain WHOIS data is a proxy service.


I'm pretty sure that this mixup is intended. This website is trying to look like an official Sketch product so that it can capitalize on Sketch's successes.


Here are some random observations and opinions.

1. The screenshot at the top of the page seems to show some features, but I have to infer what they are. It would be good to automatically cycle through annotations to show what the little "Expert" badge and all the other features mean. Sell your product in a captivating way. The headline "Become a PROFESSIONAL freelancer" is a good thing to A/B test, or to have rotate through words, e.g. "Become a SUCCESSFUL freelancer," etc.

2. The demo video is way too small. Make it much bigger so users don't have to squint. It's also too long. There's a time and place for a nearly five-minute demo video, but not this soon in the process. 30-45 seconds max.

3. The NDA feature is overhyped. I care much more about defining the scope of work and getting my clients to agree to my master services agreement or other contract. Does the product generate an agreement? Can I make a template for my agreement? More detail would be helpful.

4. I don't find value in the personal assistant feature, personally. I want to maintain direct contact with my clients once we make contact. It actually freaks me out to have someone else talking to a client and potentially making promises I don't agree to, or not behaving the way I'd expect.

5. As I scroll down the page, I'm not actually presented with a big call-to-action to start the signup process until I reach the bottom. This adds way too much resistance. Also, make the signup button green, or consider A/B testing signup button color.

6. The signup process is difficult. Instead of just asking for a couple quick details (e.g. email address and name), it opens a modal window that presents huge resistance. There's a splash screen adding yet another step to the process (you have to click TWICE to reach any form inputs).

7. The form is a Typeform full-screen modal form with a "0% completed" label. I hate Typeform forms, and I've found them to perform poorly in my own experience. For a signup process, seeing "0% completed" is a huge mental barrier. I immediately think "ugh, this is gonna take forever, I'll do this later" and I might not come back.

8. I'm not told before or during this signup process whether I have to pay any money or what I get for free versus for pay. Use the signup process as an opportunity to reinforce features and benefits. Is there a trial period? I have no interest in filling out this complex form if I'm going to have to pay right away. I need to be able to take it for a test drive.

9. You force users to provide a LinkedIn URL. Not everyone uses LinkedIn, for good reason. If you don't have a LinkedIn, you cannot proceed. You're killing signups.

10. The signup process doesn't actually work. It appears all it does is email you. So, it's actually a "contact" form and not a "signup" form. Upon submission, it says "Thank you! We'll get back soon. If you have any questions reach out to [email]." You've now dead-ended your user that is interested in your service. I doubt many will come back when you "get back soon" to them. Make sure you always give users a path forward, without manual intervention from you. Your post says, in all caps, "NO ONE PAID" -- well yes, you literally do not collect payment information or give users a path forward to payment.

11. Footer says "Use of this site constitutes acceptance of our User Agreement and Privacy Policy". But you have neither. There are no links to a user agreement or privacy policy. This suggests to me that maybe I need to be concerned about how good the onboarding process is. This might seem like a small detail, but the site is marketing itself to web developers and engineers, who will notice these details.

12. Show success stories somewhere. Seeing an HBO Silicon Valley character screams "we don't have any users." Use real stories of people who have enjoyed using the service.

13. You mention in your post an "Elite 100" program where you give people shares. I don't think this is compelling for most users, but perhaps it's worth testing and experimenting with. Currently you don't appear to advertise this anywhere. Your Product Hunt post appears to include a link to it (/100), but that redirects to the homepage. If the program isn't detailed, it doesn't exist.

14. Site needs proofreading throughout.

15. Pricing says $29 per month and then right below that says $10 per month. Which is it?


Thanks for taking the time to write such detailed feedback. Some of the errors happened because we were updating in real-time based on feedback from HN :| The Elite 100 program was taken down yesterday due to low interest; I explained it briefly in another comment above. Thanks again! :)


This is valuable feedback.


The author used `tcpdump -i lo0 -s 65535 -w info.pcap` which, as a non-root user without sudo, successfully captures loopback traffic in OS X 10.11.3.

I just tried it, and with Chrome and 1Password, I was able to see my auto-filled bank password in the pcap. So, I presume any process on my system, without root privileges, would be able to sniff loopback.

I don't see why 1Password wouldn't use TLS here. This is not good.


Your system is misconfigured.

    > $ tcpdump -i lo0 -s 65535 -w info.pcap                                                 
    tcpdump: lo0: You don't have permission to capture on that device
    ((cannot open BPF device) /dev/bpf0: Permission denied)


I'm on OS X 10.11.3:

    $ tcpdump -i lo0 -s 65535 -w info.pcap
    tcpdump: lo0: You don't have permission to capture on that device
    ((cannot open BPF device) /dev/bpf0: Permission denied)


This is a fresh OS X install on a test machine :/


I don't know what to tell you. Normal users can't tcpdump loopback on OSX, or anywhere else.

    > $ ls -l /dev/bpf*                                                                      
    crw-------  1 root  wheel   23,   0 Feb 29 07:59 /dev/bpf0
    crw-------  1 root  wheel   23,   1 Feb 29 07:59 /dev/bpf1
    crw-------  1 root  wheel   23,   2 Mar  2 11:11 /dev/bpf2
    crw-------  1 root  wheel   23,   3 Mar  2 10:07 /dev/bpf3
    crw-------  1 root  wheel   23,   4 Feb 29 08:11 /dev/bpf4


Works for me too on OS X. sudo is not needed to run tcpdump for any interfaces.

    $ ls -l /dev/bpf*
    crw-rw----  1 root  access_bpf   23,   0 Mar  1 09:18 /dev/bpf0

Edit: Wireshark is installed


Did you install Wireshark? Did you let it reconfigure your system? Is your current user in the "access_bpf" group?

Later

Yes. Your system is misconfigured. Don't let Wireshark do that.


It looks like Wireshark will happily keep your system permanently misconfigured. To fix it, disable

/Library/LaunchDaemons/org.wireshark.ChmodBPF.plist

This actually seems like a much crummier thing than the 1Password non-thing.


    $ tcpdump -i lo0 -s 65535 -w info.pcap 
    tcpdump: lo0: You don't have permission to capture on that     device
    ((cannot open BPF device) /dev/bpf0: Permission denied)
Looks like you're logged in on a superuser account or have otherwise somehow disabled some security settings.


I also can't access loopback on 10.11.3, I get this exact error. And I'm running as an Administrator account.


Yeah, it's that they installed Wireshark, and gave it privileges to chmod the BPF devices.

edit: The irony here is that Wireshark is doing something far more dangerous than 1Password.


It's either a) change the group on the /dev/bpf entries and add your user to that group or b) run Wireshark as root.
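
Option (a) can be sketched as follows; it mirrors what Wireshark's ChmodBPF startup item automates. The group name "access_bpf" follows Wireshark's convention, and note the /dev/bpf* nodes are recreated at boot, so by itself this doesn't persist:

```shell
# Create a dedicated group, give it access to the BPF devices, and add
# the current user to it. This is system configuration: run deliberately.
sudo dseditgroup -o create access_bpf
sudo chgrp access_bpf /dev/bpf*
sudo chmod g+rw /dev/bpf*
sudo dseditgroup -o edit -a "$USER" -t user access_bpf
```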


b) would in general be a lot safer, in that you're elevating one process rather than lowering a privileged interface so that every process you run can sniff it.


Correct - Least Privilege says you do the absolute least you need to do in order to make things work, so that any errors are limited to that one part of the system.

What's been done here by Wireshark isn't least privilege, or secure. It's like the opposite of least privilege and security.


On Linux you can give an executable admin access to network devices with setcap which narrows it down further. Is the same possible on OS X?

Edit: Actually, this is worse than running as root, isn't it!
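
For reference, the capability-based approach on Linux usually targets Wireshark's `dumpcap` capture helper rather than the GUI binary. A sketch, assuming a typical install path (locate yours with `which dumpcap`):

```shell
# Grant just the packet-capture capabilities to the small capture helper,
# instead of running the whole GUI as root.
sudo setcap cap_net_raw,cap_net_admin+eip /usr/bin/dumpcap

# Verify the file capabilities took effect
getcap /usr/bin/dumpcap
```

This confines the elevated code to one small binary; the GUI itself stays unprivileged.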


setcap is in principle better than setuid if your program is something like ping. Or in this case, wireshark.


Don't be so quick to dismiss this one. This is notable for several reasons: (1) it wasn't his mistake, but rather a Visual Studio bug that ignored his instructions to make a private repo on GitHub; (2) the speed at which this happened (minutes); and (3) it has useful analysis of some sorely lacking functionality in AWS that lets this continue to happen.


I don't know. Even for private repos, it's bad practice to commit private keys to source control.


> Even for private repos, it's bad practice to commit private keys to source control.

Is it, though? Committing them to the same repository as one's code lives in, sure, but committing them to a separate production-deploy repo seems okay to me (although I'd much prefer that private repo never to hit a centralised service like GitHub).


> Is it, though?

Yes, it is. Period. Full Stop. Don't ever check access keys into any repo, public, private or even self-hosted. AWS needs to do a better job of making you realize that access keys are like 100-year-old sticks of dynamite and should be handled with an equal amount of care considering they can cause a similar amount of damage.

To their credit, they basically noted this when they changed the way access keys are handled in the command line tool[1]. Quoting:

  An important point is that the default location for the credentials file is a user
  directory. It's no longer part of a project file structure, such as an app.config file
  (.NET) or .properties file (Java). This can enhance security by allowing you to keep the
  credentials in a location that's accessible only to you, and it makes it less likely that
  you'll inadvertently upload credentials if you upload a project to a developer sharing
  site like GitHub.
[1] http://blogs.aws.amazon.com/security/post/Tx3D6U6WSFGOK2H/A-...
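
A minimal sketch of the pattern that AWS note describes: read keys from the environment (or the per-user credentials file) so they never enter the repo. The function name here is made up for illustration; the environment variable names are the standard AWS ones.

```python
import os

def get_aws_credentials():
    """Fetch AWS keys from the environment instead of source control.

    Fails loudly at startup if the keys are absent, rather than
    tempting anyone to hard-code them in a committed config file.
    """
    key = os.environ.get("AWS_ACCESS_KEY_ID")
    secret = os.environ.get("AWS_SECRET_ACCESS_KEY")
    if not key or not secret:
        raise RuntimeError("AWS credentials not set in the environment")
    return key, secret
```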


> Yes, it is. Period. Full Stop. Don't ever check access keys into any repo, public, private or even self-hosted.

Generalizing here (and for the record, I agree with your point): to a developer, source code is as valuable as anything else, and if a private repo is secure enough for source code, it should be secure enough for access credentials.

I speculate that this is a byproduct of moving towards a developer-centric engineering culture.


> Don't ever check access keys into any repo, public, private or even self-hosted.

So, how do you propose to handle version control of your exact production configuration? Do you think that recording one's production configuration is a valuable (or even useful) technique? Do you propose to have a single file with production configuration relying only on point-in-time backups for history? That gets us back to the bad old days…

A local or self-hosted repo is 'a location that's accessible only to you.'


> So, how do you propose to handle version control of your exact production configuration?

Version control is perfectly acceptable for that, but the key point is that access keys aren’t part of your production configuration. I personally don’t allow my team to have long-lived AWS access keys that have access to production. Production is only changed via an assumed role requiring MFA or an IAM instance role.

If this kind of setup is too complex to create on your own, solutions like Vault (https://vaultproject.io) can make it much easier to get right.


> Version control is perfectly acceptable for that, but the key point is that access keys aren’t part of your production configuration.

Why not? If your production configuration is stored as static-file in version control, then either those credentials have to live in that version control, or they have to live outside it, as a separately-backed up file of configuration variables, which are then fed to a static-file-generating tool, which means a) that one has to select a templating system and maintain those separate templates and b) that there's another step involved in developing the prototypes for those templates. This isn't a bad course of action, of course, but it does have its costs (just like storing production configuration—including credentials—in version control has its costs as well as its benefits).

> I personally don’t allow my team to have long-lived AWS access keys that have access to production.

Of course! Developers shouldn't have access to production systems, period. But operators can, and must, and I've no problem with them having access to their own version-controlled repository of information that no other team can access.

> Production is only changed via an assumed role requiring MFA or an IAM instance role.

Of course! I'm certainly not arguing for personal credentials to be stored in an operations repo.


I use Ansible Vault if it's imperative to store keys in a repo, and I've been really happy with it. StackExchange has something similar called Blackbox[0]. I remember reading about a Ruby gem too, but I can't find it now.

[0] https://github.com/StackExchange/blackbox


> https://github.com/StackExchange/blackbox

That looks pretty cool, at least at first. I'll have to take a deeper dive into it.

I like that it looks to be all shell scripts. Wish it were sh rather than bash, but c'est la vie.


The AWS SDK for .NET supports named profiles for VS, so your source check-in only contains a profile name. If you hard-coded your IAM or account keys, or stuck them in the app.config, you're simply doing it wrong. There is no excuse. It's all here: http://docs.aws.amazon.com/AWSSdkDocsNET/latest/V3/Developer...

Also if you use a proper IAM profile locked down to specific resources then you wouldn't expose your entire account. The author stated he didn't use EC2 so why wasn't the key/secret pair an IAM account with a policy set for minimal access?

What I'd worry more about is that the VS bug exposed private source code, data and proprietary intellectual property.

We use GitHub, and I worry every day that someone will publicly fork one of our repos by accident. That would be a grave fuck-up, but it's waiting to happen. We should have stuck with centrally controlled, Active Directory-integrated SVN from a security perspective (even if it is a pain in the ass).


Well, it is partially his mistake, which he acknowledges:

> I am certainly not innocent here and some mistakes were made on my part. [...] To this end, having encrypted access keys or excluding configuration settings from GitHub would have prevented the AWS charges - and this is certainly the approach I would take from now on.


In this case it wasn't Visual studio but an extension created by Github that caused the problem.


I love this concept, and it is sorely needed. I read in your Medium post[1] that “Data is wiped from the phone as soon as it’s securely moved to the Witness servers.” I'm curious what the reasoning for this is. Of course, I get that the data is streamed/copied to Witness in the event that the device is destroyed or confiscated, but it's hard for me to think of a use case where having a local version on the device would do any harm.

In fact, I think it would be valuable for evidentiary purposes to have the original on the device. I'm assuming that the streamed/copied version is probably lossy in some regard, while the local version might be higher resolution or frame rate, also.

[1] https://medium.com/@marinosbern/witness-livestreaming-for-em...


I just watched this documentary "Point and Shoot" [1], the guy filmed all over Africa, including Libya, and at one point was taken prisoner where he said his footage could have been used to identify other members of the Libyan resistance. It isn't hard for me to imagine a situation where you don't want to give the authorities visual evidence of who was at an event or who you were standing with.

1: https://en.wikipedia.org/wiki/Point_and_Shoot_(film)


Thanks for pointing this out. The original thinking was twofold: 1) I didn't want people to get in trouble if their phones were searched and they were in possession of a recording and 2) Keep the data stored on the phone to a minimum, in order to increase available space for storing video when Cellular Data/WiFi is not available. The data is always available later through the shortlink.

You raise a very good point, though. Offering an option to keep data locally (or save it in the camera roll) in a future version would make sense.


Might be worth clarifying that the project is separate from the work of the well known human rights and video NGO called "Witness" (www.witness.org). They make a number of video tools so a user might be confused.


At the border, US customs ask whether you have "accounts on remote computers", which means practically everything. It's better to move the data offsite so police actually need a warrant to inspect it (instead of an "I thought he had a gun" excuse), but you can still be required to give the police access to the remote account.


Interesting. Have you been asked that before? I've done a fair amount of going in and out of the country and was never asked.


> but it's hard for me to think of a use case where having a local version on the device would would do any harm.

Hostile environments. It becomes incriminating evidence against you (think recording police brutality or excessive uses of force).


There's another strong reason not to trust Evernote: nothing is encrypted at rest on your system or their servers, unless you manually encrypt selected text in a specific note in the desktop application, where you have to make a new passphrase for each note you want this on. Even that was only recently upgraded from 64-bit RC2 to 128-bit AES.

They claim they can't perform searches over encrypted data, but that doesn't seem too difficult to solve with an index file that's also encrypted.

Evernote does have some detailed security policies and 2FA, but without encryption at rest where only the user has the key, what's the point?
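
The encrypted-index idea can be sketched as a "blind index": store keyed hashes of terms so the server can match queries without ever learning the words. Everything below (the key, the notes, the helper names) is invented for illustration, and a production scheme needs far more care (term-frequency leakage, padding, key management):

```python
import hashlib
import hmac

KEY = b"user-held-secret"  # stand-in for a key only the user knows

def blind(term):
    # Keyed hash of a term; the server only ever sees these digests
    return hmac.new(KEY, term.lower().encode(), hashlib.sha256).hexdigest()

def index_note(note_id, text, index):
    # Map each blinded term to the set of notes containing it
    for word in set(text.split()):
        index.setdefault(blind(word), set()).add(note_id)

def search(index, term):
    # The client blinds the query term; the server matches digests only
    return index.get(blind(term), set())

index = {}
index_note("n1", "meeting notes about taxes", index)
index_note("n2", "grocery list", index)
print(search(index, "taxes"))  # → {'n1'}
```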

