00:00:16.000
my name is R Henrick and I'm going to be talking about security today and specifically about why you as an
00:00:22.519
organization and as a developer should make security your number zero priority when writing software
00:00:29.679
so this is me I'm on the internets in various places and
00:00:35.440
obligatory warning that I work at LivingSocial and we're hiring and before I get started I'd
00:00:41.440
like to make a few uh qualifications that I think are necessary and also uh a
00:00:47.719
few thank yous that I think are in order uh first I'd like to thank the interpreters for being here I think it's wonderful and I hope to make their their
00:00:54.520
job very difficult by using a lot of large uh jargony words that they'll have to spell out
00:01:03.039
and I also want to thank all of you for coming here it proves that despite what the Python community says Ruby
00:01:08.320
developers do in fact care about security so I do have the caveat that
00:01:13.640
this talk will not make you good at security if you're not already good at security there's just too much to
00:01:19.000
cover and I don't have enough time but I hope that it will make you care more and that you will use that resource of
00:01:25.759
attention and you'll spend it on security
00:01:31.079
because the most important thing is to find out what is the most important thing and today I'm going to convince you that that's
00:01:37.920
security so before we talk about why security is priority zero let's talk
00:01:42.960
about what security is and there are some misconceptions that I'd like to clear
00:01:48.159
up is security a feature is security something that you
00:01:53.360
can bolt on at the end of a development process to make what you've already done
00:01:58.479
secure no it's not and historically speaking that style of doing security
00:02:04.479
doesn't work very well it doesn't lead to very secure products and the reason for that is that security is actually a
00:02:11.879
subset of reliability it's really the core of reliability because any security
00:02:17.280
failure is also a reliability failure if you lose data if your data is
00:02:22.720
compromised that's a reliability failure but not all reliability failures are also security failures so Ryan Tomayko said
00:02:29.480
in one of his talks that reliability is priority zero and so I would say that
00:02:35.879
that makes security priority zero and there's another misconception
00:02:42.680
that security is a static property of a system that once a system becomes secure
00:02:48.760
it stays secure but a software system is actually a very complex organization and
00:02:56.840
so security is actually an emergent property of systems both how the system interacts with the
00:03:02.239
environment around it because a system can be secure in one environment for instance an intranet behind a firewall
00:03:09.480
and insecure when moved into a different environment and two different systems or components can be secure in isolation
00:03:16.720
but can become insecure when combined together so security is not a static
00:03:22.680
property of systems it's an emergent property and the other misconception
00:03:27.720
is that security is something that you can buy off the shelf
00:03:33.840
that security is something that someone can put into a box and sell to you and Bruce Schneier who's a very famous
00:03:40.200
cryptographer and security person has a quote that I love which is that security is a process and not a product
00:03:48.000
and what that means is that security is a process of continuous improvement over
00:03:53.519
time that has a cadence of discovery assessment mitigation validation more
00:04:00.480
discovery more assessment and so on it's a continuous struggle so now that we've talked
00:04:06.879
about what security is let's talk about why we should make security a priority for starters security is your
00:04:14.360
CEO's top priority whether they know it or not and they'll know
00:04:20.040
it when they have a security breach they'll be very well aware of how
00:04:25.639
for the past two years security should have been a top priority when it's too late to make it
00:04:31.960
one so executive support is necessary to effect change in an organization and
00:04:39.080
having executive support for security is necessary because that trickles down and is what empowers developers to make
00:04:45.520
their software secure but that's not enough let's talk about you as developers and what your
00:04:50.960
responsibilities are for making software more secure first of all you need to be
00:04:56.800
aware that security is your responsibility that it's a place where you need to devote some of that
00:05:03.240
precious resource of attention because developers are the only ones that can make software secure
00:05:08.919
your manager can't do it your CEO can't do it you can do it and you're the only
00:05:14.120
ones most software security problems are caused by bad software go figure and
00:05:20.080
developers are the ones who are best suited to think about that software and change it so let's also take a moment to
00:05:26.759
understand what are our limitations as humans and why security is so hard Bruce
00:05:32.800
wrote a great book called Secrets and Lies if you want an overall sort of
00:05:38.120
30,000-foot view of what security is what security does and why it matters
00:05:44.319
you should get this book all of the books that I'm referencing here will be in my talk on Speaker Deck as
00:05:53.360
hyperlinks so you can click through so one of the problems one of our limitations is that humans are just very
00:05:59.680
very very bad at assessing risk we're very bad at correctly estimating things
00:06:05.560
that don't happen very often we're very bad at correctly estimating the downside potential of those
00:06:12.319
things Bruce has a quote that I love which is that more people are killed every year by pigs than by
00:06:18.840
sharks which shows you how good we are at evaluating that kind of risk and it
00:06:24.160
also shows you that pigs are scary so I want to talk more about
00:06:30.360
assessing risk in a moment but I also want you to keep in mind that our brains are hardwired to get this wrong so we
00:06:36.880
have to think harder and overcome that liability and that limitation to get it
00:06:41.960
right so let's talk about why security may not already be a priority in your organization let's identify some of the
00:06:48.319
blockers that are preventing Executives and management from really placing the priority on security that they need to
00:06:55.479
one of those is insufficient funding and resources another is an absence of In-House expertise and another is a lack
00:07:02.879
of an effective security strategy so I don't have time to tell you how to fix
00:07:08.080
all those but be aware that you might run into those obstacles when you're talking to your manager or when you're
00:07:13.919
talking to Executives about trying to convince them that you should spend more time on security and
00:07:20.960
one of the best ways that I've found to make security matter to the people who are ultimately responsible the executives
00:07:28.080
is to make it meaningful to them to talk about the repercussions and the
00:07:35.759
consequences of security failures using words that make sense in their
00:07:41.319
language using technical words won't really get the problem across correctly
00:07:46.680
to someone who speaks the language of business so let's talk about a very
00:07:51.759
short and very non-exhaustive list of some of the consequences of security failure
00:07:59.919
the first is remediation cost the amount of money you have to spend to make it
00:08:05.000
right and this includes liability for stolen assets or information repairing
00:08:11.400
damage to the system that may have been caused incentives offered to customers or to partners to win back
00:08:19.360
trust protection costs include things like making organizational changes
00:08:26.440
adding resources to address the security concern like deploying additional personnel or
00:08:33.159
technologies training employees bringing in third-party experts
00:08:38.440
and consultants loss of revenue is an obvious one people will stop buying a
00:08:45.160
product that they don't trust but also unauthorized use of
00:08:51.480
proprietary information can also be a source of lost revenue compromise of brand and identity loss of customer
00:08:58.720
trust and goodwill goodwill is very easy to lose and very hard to build so you
00:09:04.560
can spend years as a company building the trust and goodwill of your customers
00:09:10.160
and lose it in an instant if you don't allow security to be a
00:09:16.320
priority finally legal exposure liability to litigation fraud and extortion and other
00:09:23.440
illegal ways of damaging the company and it's also important to
00:09:28.880
remember that security is a part of a larger picture security actually trades
00:09:34.160
off against other business needs and so you have to decide it's not just about
00:09:41.160
making it as secure as possible it's about making it secure enough for the business so let's talk about some of the
00:09:49.399
the sort of the top-level goals for security and some of the principles that we use to guide how we think about
00:09:57.920
security the first principle is that there is no such thing as a secure system I said before that
00:10:04.800
security is not a product it's a process and your goal is to make the system
00:10:10.720
secure enough to meet your business needs now and to be continually improving your security over time
00:10:18.640
there are a couple great resources OWASP is a ginormous
00:10:24.240
wiki that will go into a lot of detail about many of the things that I'm only going to be able to skim over in this
00:10:31.320
talk John Viega has written a couple books on security this is one of them
00:10:36.399
it's amazing if you care about security and I hope you all do you should consider picking it up so let's talk
00:10:42.600
about four fundamental pillars of security the first is confidentiality of
00:10:49.079
your business information and your users' data things that are private need to stay private integrity means
00:10:56.240
that things shouldn't change unless they're supposed to availability is I think a term
00:11:01.480
you're more familiar with it's the ability of the system to stay
00:11:06.560
working over time and accountability is the ability to trace an action in your
00:11:11.680
system to a specific user and then hold them accountable for that action so this includes monitoring intrusion detection
00:11:19.440
auditing tracing and that kind of thing so let's talk about some principles that we
00:11:25.320
use at a higher level to guide our thought process around security one of
00:11:30.839
the most important is that your system is only as secure as its weakest link and that your potential attacker
00:11:39.120
doesn't need to break encryption if one of the boxes on either end is
00:11:44.360
vulnerable you know encryption is very hard whacking someone with a rubber hose is
00:11:50.399
relatively easy defense in depth means that you need to have a layered
00:11:55.440
security model so a breach in an outer layer doesn't open up the entire world
00:12:01.160
within so you need a diverse set of defensive strategies that
00:12:08.320
minimize the impact of that penetration failing securely means that if a part of the
00:12:16.399
system fails it should fail in a way that the system stays secure so for
00:12:23.680
instance a security-related operation if it fails if there's an exception it should
00:12:30.040
cause the system to behave as if the operation had been denied so for instance login_required in your Rails
00:12:36.959
app shouldn't return true if it raises an exception the next three are a
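That fail-closed idea can be sketched in plain Ruby; this `login_required?` helper and its user store are hypothetical, not code from the talk:

```ruby
# Hypothetical sketch of a fail-closed check: any error while deciding
# behaves like a denial, never like a successful login.
def login_required?(session, user_store)
  # fetch may raise (missing key, dead connection, etc.)
  user = user_store.fetch(session[:user_id])
  !user.nil?
rescue StandardError
  false # fail securely: an exception means "not logged in"
end
```

If the lookup blows up, the caller sees an ordinary denial rather than an accidental grant.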
00:12:42.760
positive security model which means that we should prefer whitelists to blacklists in Rails for
00:12:49.440
instance we should prefer attr_accessible to attr_protected and the reason for
00:12:55.040
this is that defining what is allowed rather than what is disallowed limits our susceptibility to new attacks
00:13:03.320
or unanticipated inputs because there's an infinite variety of inputs and we can't protect
00:13:10.160
against all of them secured by default means that your system should come with the
00:13:17.240
most secure settings enabled by default this means in a Rails app that CSRF protection is turned on that
00:13:24.839
cookies are encrypted all of the default things should make your system more secure and not less
00:13:31.760
secure and the principle of least privilege means that you should only grant the minimum level of access necessary to
00:13:39.079
perform an action and Unix's not-very-fine-grained access
00:13:45.120
controls make that difficult but you should still try to separate users and groups and make sure that you give each
00:13:53.199
operation as few privileges as are necessary to actually get it done so detecting intrusion
00:14:00.079
obviously you need to monitor your system using things like checking for statistical anomalies and other ways to
00:14:07.360
tell if someone is attacking your system you need to be aware of it compartmentalizing means to isolate a
00:14:12.959
security failure so that a breach in one area doesn't allow the attacker to then move on to other parts of your
00:14:20.680
system and be reluctant to trust means that you should assume that the
00:14:25.839
environment your system lives in is a hostile one an insecure one and you should minimize the amount of trust that
00:14:33.079
your system gives to other parts of the environment and to other components within that
00:14:41.800
system and finally keep it simple the fewer components a system has the fewer
00:14:49.199
opportunities there are for you to get something wrong now some of you might be thinking back and saying well defense in
00:14:56.240
depth and compartmentalization seem to imply increased complexity and I would say two things I
00:15:03.360
would say shut up and you're right some of the principles in here are
00:15:09.839
in tension with other principles so making a system more compartmentalized is in tension with keeping that system
00:15:16.600
simple so be aware that you are trading off to follow these principles so let's
00:15:22.000
talk about the biggest issue in security and the hardest issue in security which is managing
00:15:30.000
risk managing risk is the way that we identify potential problems and decide
00:15:36.519
on a rational strategy to address them the goal of risk management is to accurately assess risk so that you can
00:15:42.880
make well-informed business decisions about how to mitigate it so risk to define it is an expected loss of one of
00:15:51.079
the four pillars confidentiality integrity availability or accountability and there's a book about
00:15:57.160
this so you know it's good risk tolerance first off you need to identify your organizational
00:16:04.399
level of risk tolerance what is your appetite for risk and more specifically what is your CEO's tolerance for risk
00:16:11.759
that's where the buck stops once you determine what your baseline level of risk tolerance is you need to start modeling potential threats in your
00:16:18.759
system you need to have an understanding of what are the vulnerabilities and threats in your system
00:16:25.480
a vulnerability is a flaw or weakness in the system's design implementation operation or management
00:16:31.959
and a threat is a possible danger like an external attack that might exploit a
00:16:37.360
vulnerability so you have to have a threat and a vulnerability come together for there to be a problem a threat in an
00:16:45.199
area where there's no vulnerability a dangling threat is not a risk and likewise a vulnerability that no one can
00:16:52.120
target that isn't threatened is not a risk either so when doing this threat modeling
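That pairing can be stated as a one-line predicate; this tiny illustration is my own, not from the talk:

```ruby
# A risk exists only where a threat and a vulnerability intersect.
def risk?(threat:, vulnerability:)
  threat && vulnerability
end

risk?(threat: true,  vulnerability: false) # dangling threat: no risk
risk?(threat: false, vulnerability: true)  # untargeted flaw: no risk
risk?(threat: true,  vulnerability: true)  # both present: a real risk
```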
00:17:01.120
it's important to think like a hacker to put yourself in the mindset of someone who's trying to break into your system
00:17:07.720
and one of the non-obvious parts about this is
00:17:13.439
that hackers will spend a lot of time or are willing to spend a lot of time doing
00:17:19.240
things that you think are stupid so if you ask a lot of
00:17:25.600
organizations like Microsoft for instance for a long time it was possible to exploit Word just by creating a file
00:17:33.320
that looked like it was in the right format but had some random bits at the end and Word would try to process it and
00:17:40.720
it would crash and when they asked the Microsoft engineers about this they said why would
00:17:46.240
anyone create a malformed Word doc no one would why would anyone do that no one
00:17:52.480
would that's stupid no one would do that so their failure to think the way a hacker would means that they weren't
00:17:58.320
aware of that potential threat and that vulnerability which seemed like a dangling vulnerability to them that
00:18:04.320
wasn't a real risk turned out to be a real risk so the actual
00:18:09.760
process of modeling threats one of the best processes for modeling threats is to use what's called an attack tree
00:18:17.240
which is to start at the high level of the root with what the attacker's goal is and then to branch down into possible
00:18:24.919
ways that they could implement their goal so I'll give you a good example of how to intercept information over a
00:18:31.880
secure SSH connection this is the kind of thing and actually three four and five and
00:18:38.320
whatnot are expanded as well into very large subtrees so you know if you do threat modeling properly there's a lot of work
00:18:44.600
to it I don't want to give you the impression that this stuff is easy because it's not so there are a lot of
00:18:49.640
ways that you could do this you could break the encryption you could break the RSA key encryption but that's hard
00:18:54.960
that's pretty hard you could obtain the private key from the user via a number of methods you
00:19:01.520
could break into their machine you could get physical access to the computer you could compel the user to give it to
00:19:07.360
you by blackmailing them or beating them with a wrench or tricking them into giving you the private key and at the same
00:19:13.400
time you would also need their password you could attempt to put
00:19:18.440
yourself in between the client and the server in some way so that you could perform a man-in-the-middle attack you could attempt to subvert the server
00:19:26.000
so that the client thought it was connecting to the real server but in fact was connecting to your own server because people generally don't pay
00:19:32.000
attention to warnings and when they see that the host key has changed they'll go yeah whatever I don't care just
00:19:37.240
just let's keep going so there are a lot of ways that if an attacker wanted to intercept this communication
00:19:43.799
they could do it so you have to make this tree this list of all of those things this is Bruce Schneier talking about
00:19:51.039
attack trees if you want to research this more and after you make this tree and you find out all of these
00:19:57.360
implementations of the attack you have to start assessing each one in terms of the risk it poses to your
00:20:03.760
system so let's talk about risk assessment because of the two parts in
00:20:10.200
security I think this is the second hard part the first hard part is figuring out what all the risks are in your system
00:20:15.360
and the second part is figuring out how risky they actually are so the
00:20:20.440
qualitative way to do this doesn't work the quantitative way to
00:20:26.440
assess risk is to create risk ratings to assign each
00:20:32.640
threat a rating that talks about how impactful it is on your system and there
00:20:38.679
are a couple of ways to calculate this rating and one of the simplest is to multiply the expected impact of this
00:20:46.080
event in terms of lost revenue or legal liability by the probability of it
00:20:53.120
occurring which is a pretty simple metric there are more in-depth metrics like DREAD
00:20:59.559
damage is how bad would an attack be reproducibility is how easy is it to
00:21:05.200
continue to reproduce this attack once they discover it how easy is it to do it
00:21:11.880
again exploitability is how hard is it to exploit this in the first place do you have to run a script or do
00:21:19.120
you have to spend hours doing it affected users is what is the breadth of impact of this threat and
00:21:26.919
discoverability is how hard would it be for an uninformed person to learn that this is a vulnerability and a threat so
00:21:33.520
you can assign numbers for each add them all up and you get a rating
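A DREAD-style rating can be sketched like this; the threats and scores below are invented for illustration, with each category scored say 1 to 10 and summed:

```ruby
# Sum the five DREAD categories into a single rating per threat.
DREAD_CATEGORIES = %i[damage reproducibility exploitability affected_users discoverability]

def dread_rating(scores)
  DREAD_CATEGORIES.sum { |category| scores.fetch(category) }
end

# Hypothetical threats and scores, purely for illustration.
threats = {
  "SQL injection in search form" =>
    { damage: 9, reproducibility: 9, exploitability: 7, affected_users: 9, discoverability: 6 },
  "verbose error pages leak stack traces" =>
    { damage: 3, reproducibility: 8, exploitability: 2, affected_users: 4, discoverability: 7 },
}

# Highest rating first, so the riskiest threats get addressed first.
ranked = threats.sort_by { |_name, scores| -dread_rating(scores) }
```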
00:21:38.799
once you have rated all of these threats you want to prioritize sort by
00:21:43.840
rating and then you want to start addressing them and there are essentially four ways
00:21:50.880
to address risk you can accept the risk you can say we're okay with this the way
00:21:58.360
it is and we're not going to change it for instance credit card companies deal with fraud but they do it
00:22:06.320
in an interesting way credit card companies could make fraud happen a lot less if they wanted to but they don't
00:22:13.480
because the fraud costs them less than the operational cost of preventing that
00:22:19.120
fraud so they make a very calculated business decision to accept that
00:22:25.880
risk they mitigate it just enough that the risk becomes acceptable for them and then they accept it avoiding a risk
00:22:33.080
means don't do that thing that was putting you in the position in the first
00:22:38.200
place transferring a risk means making it someone else's problem my personal
00:22:45.200
favorite a good way to do this and a good example of this is PCI compliance
00:22:50.799
don't let those credit card numbers into your system then your system is not vulnerable use a service that allows
00:22:58.279
you to do an encrypted sort of back channel direct from the browser via JavaScript or posting a form and
00:23:05.919
redirecting back Braintree and some other people do this so that the credit card data that private data
00:23:12.640
never enters your system it never becomes vulnerable PCI compliance then becomes here's a self-assessment
00:23:19.919
checklist and you check off four boxes and you say I'm compliant now
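One belt-and-braces way to enforce that "card numbers never enter the system" rule is a guard that rejects anything shaped like a raw card number; this sketch is my own, not a Braintree API, and uses the Luhn checksum that real card numbers satisfy:

```ruby
# Luhn checksum: double every second digit from the right, subtract 9
# from anything over 9, and the total must be divisible by 10.
def luhn_valid?(digits)
  sum = digits.reverse.chars.each_with_index.sum do |ch, i|
    d = ch.to_i
    d *= 2 if i.odd?
    d > 9 ? d - 9 : d
  end
  (sum % 10).zero?
end

# 13-19 Luhn-valid digits (spaces and dashes ignored) looks like a raw PAN.
def looks_like_card_number?(value)
  digits = value.to_s.gsub(/[\s-]/, "")
  digits.match?(/\A\d{13,19}\z/) && luhn_valid?(digits)
end

# Hypothetical request-params guard: fail loudly if a raw PAN sneaks in;
# only opaque tokens from the payment provider should ever reach us.
def reject_raw_pans!(params)
  raise ArgumentError, "raw card number detected" if params.values.any? { |v| looks_like_card_number?(v) }
  params
end
```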
00:23:25.720
so transferring risk if possible can be a solution and finally if you can't do any of the other three you have to
00:23:31.600
mitigate it somehow and we're going to spend most of the rest of this talk
00:23:36.720
talking about how to limit your exposure to risk the first and most important way to
00:23:42.400
do that is to design your system for security from the very beginning design
00:23:47.520
it from the ground up so that your system is secure make it a part of your everyday thought process make it a part
00:23:53.360
of your code reviews consider things like inputs and data flows and how they
00:23:58.559
can affect your system users roles and rights and make sure you're doling out permissions correctly and as minimally
00:24:05.000
as possible having proper authentication and authorization for your users validating the trust relationships
00:24:12.559
in your system what parts of your surrounding environment does your system trust either implicitly or
00:24:19.039
explicitly and what other components within your system do various components trust and is that trust actually
00:24:26.200
warranted and is that trust creating a vulnerability for you and the
00:24:33.360
second step after making security a part of the way you design software is making sure that you implement that software
00:24:41.520
responsibly try not to write your own cryptography algorithms you're going to
00:24:48.480
do it wrong and then I'm going to laugh at you try to use systems that have been
00:24:55.159
proven out there in the wild like SSH lean on
00:25:01.080
hardened systems if you're going to for instance write your own GitHub and you want to
00:25:08.360
start managing SSH access to git repositories and you realize that an authorized_keys file with 10,000 keys in
00:25:15.320
it is not very performant what you don't want to do is replace sshd that would be the worst thing you
00:25:22.480
could possibly do what you want to do is take sshd which is an
00:25:29.559
authentication protocol SSH says are you who you say you are and replace the part
00:25:35.640
of it that it uses for authorization which is the configuration in authorized_keys but keep the actual
00:25:43.360
hard part which is SSH the same and don't try to do it yourself make sure that
00:25:50.440
security is a part of your code reviews the code review is the single
00:25:55.919
most effective touch point you have for including security in the design and implementation of your software
00:26:02.480
the single best thing you can do is to include security in your code reviews if you want your code to be more secure so
00:26:09.720
we've talked a lot about the importance of discovering
00:26:15.000
vulnerabilities and assessing threats let's talk about different classes of vulnerability so we can
00:26:20.480
see a little bit more about what I mean when I say vulnerabilities there's a book about
00:26:25.840
this also there are lots of books there are vulnerable
00:26:34.000
designs vulnerable designs are vulnerable because the code can do
00:26:39.720
exactly what it's supposed to and the software will still be
00:26:44.840
insecure vulnerable implementations are exactly the opposite it means that the design is
00:26:50.919
good and secure but there's a problem with the implementation that creates a security risk so for instance
00:26:57.120
buffer overflows in a language that's not Ruby are a great example
00:27:02.240
vulnerable operations mean that both the design and the implementation are fine
00:27:07.799
but you designed it and implemented it for a different environment and when you
00:27:12.960
put it into the one it's in it becomes insecure also don't forget that our
00:27:18.559
users the people using our software are even worse at security than we are
00:27:23.880
because you've all self-selected to come to a talk about security and they haven't this is a Twitter account that
00:27:30.720
retweets people who post their credit card pictures on
00:27:36.000
Twitter so people actually do this there was a study reported by the BBC
00:27:41.080
that shows that more than 70% of people would exchange their computer
00:27:46.399
password for candy I will give you chocolate if you
00:27:52.679
give me your password and three out of four people said yes
00:27:59.720
one of my favorite quotes about this and pigs apparently come back into this again I don't know what software people
00:28:05.200
have with pigs is that given a choice between a dancing pig and security users
00:28:10.279
will pick the dancing pig every time and Mozilla says this a bit more
00:28:16.480
verbosely essentially what Mozilla is saying here is that users don't understand the risks involved in
00:28:23.600
using the internet and so we should as much as possible not rely on their
00:28:29.360
judgment here's an example of how relying on a user's judgment doesn't
00:28:35.919
work no one clicks that no button I want to go to that site you're stopping me
00:28:41.039
stop stopping me they click yes there's a view certificate button but these
00:28:46.360
users don't know what a certificate is it's not providing them with useful information so don't disregard usability as
00:28:54.279
good as your system may be designed and implemented if its usability is a
00:28:59.840
security risk some of your users will be dumb enough to do something
00:29:05.159
wrong and in fact it's not them being dumb it's them having other priorities and not caring about security which they
00:29:11.799
shouldn't you should care about security they should care about getting their job done so let's talk about some specific
00:29:17.960
attacks or exploits based on these vulnerabilities some specific potential threats the most underestimated threat
00:29:24.399
is social engineering and I love this xkcd and I'll give you a moment to read it if you
00:29:34.159
can because the fact of the matter is when people want to crack systems when they see this really hard
00:29:41.279
thing like RSA they don't think oh what a challenge
00:29:46.480
I'm going to go break into that they think well I could just beat the guy until he gives me his
00:29:52.080
password Kevin Mitnick who you may know was at one point the most wanted computer criminal in the United
00:29:57.440
States said that companies can spend millions of dollars toward technological protections but that's
00:30:03.000
wasted if I can just call your tech support and get them to give me what I
00:30:08.360
want so the second large problem is malicious input little Bobby
00:30:13.799
Tables here this includes things like SQL
00:30:19.120
injection cross-site scripting SQL injection is also an example of a cross-layer vulnerability so
00:30:26.960
when little Bobby Tables' name came in as a parameter in the HTTP
00:30:32.720
request that was not a threat it was a threat once it passed into the database
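A pure-Ruby illustration of that layer crossing, with no real database involved; the quoting helper is a toy standing in for your driver's real parameter binding:

```ruby
# The payload is inert as a mere string parameter...
name = "Robert'); DROP TABLE Students;--"

# ...and becomes executable SQL only when interpolated across the layer boundary.
unsafe_sql = "INSERT INTO Students (name) VALUES ('#{name}')"

# Toy placeholder binding: escape quotes so the payload stays inside a string
# literal (real code should use the database driver's bound parameters).
def bind(sql, value)
  sql.sub("?", "'#{value.gsub("'", "''")}'")
end

safe_sql = bind("INSERT INTO Students (name) VALUES (?)", name)
```

In `unsafe_sql` the DROP TABLE escapes the string literal and reaches the database as a command; in `safe_sql` the doubled quote keeps the whole payload inside the literal.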
00:30:40.679
it's the interface layer with the database that turned that into a threat so consider that some threats cross
00:30:45.799
operational boundaries and are not threats in one place but become threats in other places hijacking and
00:30:52.960
spoofing Firesheep lets you use other people's Facebook if they don't connect via HTTPS you know just if you want
00:31:01.480
to this also includes CSRF man-in-the-middle phishing and a number
00:31:08.120
of other attacks so now that we've got an idea for what the problem space
00:31:14.399
is and how difficult it is to plan for security let's also talk about
00:31:19.639
how difficult it is to respond to an actual security incident and there's
00:31:25.559
also a book about this and the van Wyk and Forno book is an excellent book that answers a lot of
00:31:31.399
questions about incident response and also raises a few of its own like why does a scuba diver have a
00:31:42.279
pickaxe so one day this will probably happen to you someone is going to break
00:31:49.399
into a system that you own and you're going to want to know what to do and here's step one step one
00:31:56.840
is fight your immediate fight-or-flight emotional response and take control
00:32:03.639
of the situation and stick to the script that you've spent months and months planning oh you have that
00:32:09.720
script right you guys have written that script right you have an incident response program don't you yes all of
00:32:16.279
you do show of hands who has an incident response plan I hear that hand raising is a good way to involve the audience okay so
00:32:21.600
no one does that's good five of you I think so it needs to handle
00:32:27.919
resourcing and training needs to handle a plan for communicating needs to handle a standard
00:32:33.279
operating procedure for responding to a wide variety of potential attacks and it needs to be flexible enough to
00:32:40.600
compensate for the fact that most attacks don't go by the book — are not done according to your
00:32:46.559
script the basic incident response workflow
00:32:52.799
flowchart, if you will, which is from that book and looks okay up there, not bad
00:32:59.000
is first identify that you are in fact being hacked difficult to do anything if
00:33:04.039
you don't and in fact we don't have good statistics about this, but we're pretty sure that the vast majority of
00:33:11.559
hacks go undetected because if a hacker does a really good job, you just won't know
00:33:17.639
that they were there so spend a lot of time making sure your intrusion detection is as good as it can be once
00:33:25.279
you've determined that there is an attack going on, you need to coordinate and start assessing the
00:33:31.559
damage then you need to start mitigating that damage then you need to start
00:33:36.639
investigating what went wrong and learning from the experience so
00:33:42.760
accountability is the pillar that helps us detect, dissect, and demonstrate an attack it's the thing that gives us
00:33:48.840
intrusion detection it's the thing that gives us audit logs it's the thing that gives us tracers through the system it's
00:33:54.720
the thing that lets us know who did
00:33:59.760
what and what they did that's critical you have to make sure that as
00:34:06.200
this process is going along you document all the things you're learning about the attack as it's taking
00:34:11.359
place all of the symptoms you see in your system things that don't make sense things that are outside of operational
00:34:17.720
boundaries and any observations you have make sure that you write these down they may come in very handy later for
00:34:23.639
instance when you're filing this with the SEC or in front of a courtroom
00:34:29.240
um make sure that everyone who can do something about it is available and
00:34:35.760
coordinated together and then you need to start responding you need to start doing things taking steps to fix the
00:34:44.000
problem you need to isolate the systems that you believe are affected and even the ones that you think maybe might kind
00:34:50.040
of be affected any system that you think could potentially be a problem shut it
00:34:55.359
off, take it off the network, isolate it don't let whoever's in there do any
00:35:00.800
more damage than they already have when in doubt, just nuke that
00:35:09.520
system once you've come up with a fix once you've figured out exactly what was
00:35:16.040
wrong you've come up with a way to fix your system you've tested that fix against all of this data you have about
00:35:22.040
the attack because you collected all that data like I told you to once you've tested the fix and you believe it works
00:35:28.720
what you don't do is you don't patch the system that was running and then boot it up
00:35:34.920
again what you do is you rebuild or replace from scratch every system that
00:35:41.000
was affected and I know what you're thinking you're thinking that's a lot of work and I don't want to do it well
00:35:46.320
tough cookies because if you think that you know how to fix that system in place
00:35:52.760
that disqualifies you from having the necessary knowledge to fix it in place
00:35:58.760
so once you've fixed the system, once the attack is over, take all
00:36:05.319
of that documentation that you were writing all of that data that you gathered turn it into a comprehensive
00:36:11.640
incident report file it away somewhere safe and then don't just sit on it use
00:36:19.560
it to learn to educate yourselves and to make your incident response practices
00:36:26.000
better for the next time it happens let's also
00:36:31.400
talk about after the incident happens what do you do then you have to
00:36:39.960
disclose you have to disclose to customers and partners who may be
00:36:45.040
affected you may eventually have to disclose to the SEC if you're a public company and it sufficiently
00:36:51.480
impacts your financials you can talk to your CFO about that and they can tell you how fun that's going to be for you
00:36:58.240
um you may have to disclose in court you need to make sure that that data is ready to do so you also have an
00:37:08.000
obligation to disclose to your users and your other partners uh you need to disclose as soon
00:37:14.280
as possible don't wait a week or a month don't wait an hour if you don't have to
00:37:20.200
as soon as you know what's happening and no sooner you need to let your users know
00:37:28.440
that you have put them in danger and that they need to start taking steps to protect themselves because you've just
00:37:35.119
lost you've just lost your ability to protect them so you have to make sure that they can now protect
00:37:41.119
themselves so you have a responsibility you need to give them the information that they need you need to let them know that
00:37:47.280
their password or their address or their credit card information may have been compromised whatever it is that you
00:37:53.599
know, you have to tell them and you have to do it in a way that
00:38:03.280
both looks at the responsibility that you have to your users first and then looks at the responsibility you have to
00:38:10.480
your company and your organization second so you have to do it in a way that starts to help you rebuild some
00:38:16.720
lost trust as well so here are some things you shouldn't do when you're disclosing an incident on your blog or
00:38:22.520
your twitters you shouldn't minimize the impact of the incident to your users "a few of you might have been affected, but I
00:38:28.280
wouldn't worry about it, it's probably fine you may want to check your balance tomorrow morning but, you know, it's
00:38:33.760
probably fine" don't pretend that you have suddenly become a security expert right after you got hacked it's not
00:38:40.240
going to fly no one's going to believe you and don't think that this is somehow an opportunity for you to get better PR
00:38:47.599
it's not an ad spot that's not your job right now so what should you say it's
00:38:53.800
very simple what happened to the best of your knowledge exactly what happened why
00:39:00.160
it happened what in your system was vulnerable and allowed this attack to happen and by the way don't ever blame
00:39:07.680
the hackers you don't control them, they don't owe you anything it puts you in a position of
00:39:15.839
giving away all of your power and responsibility when you say those terrible hackers did this terrible thing
00:39:21.000
the terrible thing that happened is that you left that hole in your system in the first place after you fix it, tell them what
00:39:28.920
you did to fix it that's the first step to rebuilding their confidence and they're going to want to know and then
00:39:34.960
tell them what you're going to do in the future to make sure things like this aren't as likely to happen what you're
00:39:41.000
doing how you're changing your security practices how you're improving your security practices to make things better in the
00:39:48.160
future and tone here is also very important so be humble in
00:39:55.960
defeat be candid about what happened don't gloss over important
00:40:01.560
details because they make you look bad or you're embarrassed you're going to be embarrassed you're going to be maybe the
00:40:07.319
most embarrassed you've ever been in your life well, deal with it but you still have to present some
00:40:13.880
air that you kind of still know what you're doing so try to be confident about the steps you're taking to fix
00:40:19.040
this in the future but don't use subjective wordings talk about facts,
00:40:27.920
talk about exactly what happened be objective and here are some good
00:40:33.079
examples of how not to do all the things I was just talking
00:40:38.800
about do you guys remember when LinkedIn got hacked? here's the first thing
00:40:45.280
they said on June 6th of this year they said that our security team
00:40:51.400
continues to investigate this morning's reports which implies that they learned
00:40:56.440
that they were getting hacked by reading blogs and that's not good and they
00:41:02.319
also are unable to confirm that any security breach is occurring and a lot of
00:41:08.520
companies say this and then come back to you a day or a week or a month later and
00:41:13.599
confirm that they were in fact hacked no, very few companies say hey, we might be getting hacked — oh wait, just kidding, just
00:41:20.880
kidding, everyone and while they were unable to confirm it, you know who was able to?
00:41:28.680
the same day, a blog that knew a very accurate
00:41:38.319
number of hashed passwords that were
00:41:44.160
available that's a very accurate number that's not like "a few million," that's 6.46
00:41:50.480
million passwords why is that number so accurate? I'll tell you why: because they were on the internet
00:41:56.720
already and they just did a word count, wc -l that's it, that's easy so the
00:42:02.440
internet knows the data is living right there on a public site for anyone to look at, but LinkedIn doesn't know
00:42:09.000
they're still investigating, they're still reading the blogs they're going to wait for the blogs to come to a consensus about what happened to their
00:42:14.520
system and then they'll tell you about what that was and security experts had already found their hashed
00:42:20.960
passwords in LinkedIn's list so they had confirmed that passwords that they only used on LinkedIn were in the list of
00:42:28.000
hashed passwords supposedly retrieved from LinkedIn and if you're following, that
00:42:34.280
implies that it's impossible for that list to not be accurate and then three days later
00:42:40.200
three days later they say yeah, there were some passwords on that list, yeah, we're sorry about that, guys that's bad
00:42:48.720
and they believe that they've fixed the problem but if you remember, they never
00:42:55.359
took any systems offline, they never had any loss of service so I have doubts they also
00:43:03.000
said that — and I love this — all of the passwords on the
00:43:11.280
published list have been disabled I don't think it's a good idea
00:43:18.480
to rely on information provided by the person that just hacked you to
00:43:23.599
determine what the impact of their hack was do you think it's possible that they
00:43:29.640
kept half of the list in reserve? maybe seems like a thing they could do and
00:43:35.720
then they start falsely reassuring people based on this completely fallacious assumption we don't think
00:43:41.520
your account's at risk, you're probably fine, don't worry which is exactly the opposite of
00:43:48.599
what a proper incident response or disclosure would look like and then they said that they've added an additional
00:43:55.640
layer of technical protection known as — this thing they should have been doing the whole
00:44:02.400
time that was it, that was the whole response and then they had a blog post about how you can use your LinkedIn
00:44:08.319
account to get a job that was it, that was the whole response eHarmony got hacked on the same day on
00:44:15.640
the same day, June 6th after investigating the reports — how,
00:44:21.480
why are these people investigating reports? where are these reports coming from?
00:44:27.920
why aren't they investigating their systems and their auditing and
00:44:33.440
their incident detection tools and learning from those, and making it clear that they actually were able to
00:44:41.079
do so and not wait for blogs to tell them that they're being hacked so they've found that a small fraction of
00:44:47.680
their user base has been affected which is, once again, a gross minimization of
00:44:53.000
the risk because it's very difficult, if someone gets access to your passwords,
00:45:00.960
for them to stop they're not going to say well, we got like 1% of them — are we good now?
00:45:06.599
we're good, right? we can just stop the download now, we're good, okay, awesome, let's move on and as a
00:45:12.440
precaution they have reset affected members' passwords there's this idea that keeps cropping up that we know better than the
00:45:19.640
hacker does who got screwed, and now we're going to let you know whether you got screwed or not
00:45:26.319
what's going to tell them whether they got screwed or not is whether that password they shared with their Gmail
00:45:31.839
got them access to a password reset to their bank account which depleted
00:45:37.079
their savings account that's how they know if they're affected not because you tell them so never make the mistake
00:45:44.040
of telling users that you know which ones were affected you can't possibly know oh, and by the way, the most popular
00:45:49.559
LinkedIn passwords were things like God and work and 1234 and
00:45:54.839
password so talk about some good examples of what you should do and one
00:46:01.280
of my favorites in recent memory is the way that GitHub handled a recent public-key
00:46:07.200
vulnerability caused by Rails' mass assignment vulnerabilities which, if
00:46:13.319
you read the Rails security guide — which you guys all have, yeah — it's in there I
00:46:19.040
guess they didn't but they did handle it very well after they were attacked at around 9:00 a.m.
00:46:28.760
they discovered — it's implied that they discovered — that a GitHub user exploited a security vulnerability notice that
00:46:35.000
what they don't say is we've heard some reports that there may be a problem with our public keys
00:46:42.480
what they don't say is we don't believe very many users have been affected by this vulnerability what they say is here is what
00:46:48.800
happened and by the way, this
00:46:56.240
blog post was out within hours of the incident, not days in the same blog post they make it
00:47:03.240
clear that about an hour later they fixed it and had already started an
00:47:09.359
investigation into the root cause of the attack which they then told us the root
00:47:15.040
cause of the vulnerability was a failure to properly check parameters — mass assignment and then they talked about
00:47:22.200
what they will do in the future to make it better they're going to do a full audit of their code base seems like a good
00:47:28.599
thing to do, I'm glad they're doing it so that is the full
00:47:33.640
story what happened, why did it happen, how did we fix it, what are we going to
00:47:40.079
do in the future and the thing that they don't mention here is who was affected this is
00:47:45.440
the only drawback that I have to GitHub's disclosure they don't do a great job of mentioning who could
00:47:51.480
possibly have been affected by this vulnerability so that's a thing they could have done better
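The mass assignment hole GitHub was bitten by, and the whitelist-style fix the Rails security guide recommends, can be sketched in plain Ruby — a hypothetical miniature, not GitHub's actual code. The `permit` helper stands in for what Rails 4's real `params.require(:public_key).permit(:title, :key)` does:

```ruby
# Mass assignment in miniature: "take the whole params hash from the request
# and write every key into the record." If the record has a sensitive
# attribute (here, which user owns an SSH public key), the attacker just adds
# that key to the submitted form.
PublicKey = Struct.new(:title, :key, :user_id)

# Vulnerable style: no whitelist, every incoming attribute gets written.
def mass_assign(record, params)
  params.each { |k, v| record[k.to_sym] = v }
  record
end

# Strong-parameters style: only whitelisted attributes get through.
def permit(params, *allowed)
  params.select { |k, _| allowed.include?(k.to_sym) }
end

attacker_params = { "title"   => "my laptop",
                    "key"     => "ssh-rsa AAAA...",
                    "user_id" => 1 } # attacker claims someone else's user id

victim = mass_assign(PublicKey.new, attacker_params)
# victim.user_id is now 1 — the attacker's key belongs to another account

safe = mass_assign(PublicKey.new("", "", 42),
                   permit(attacker_params, :title, :key))
# safe.user_id is still 42 — the hostile user_id was filtered out
```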
00:47:57.040
Last.fm recently got hacked as well this has been a big year for people
00:48:02.400
getting hacked and their initial response was we're investigating the leak which — not by, you
00:48:10.720
know, reading blogs, but by, like, investigating things, like you're supposed to and as a precautionary
00:48:16.040
measure, and before they were even sure of the full extent of the damage, they asked all of their users to change their
00:48:23.160
passwords all of them not the few that they believed were affected all of
00:48:29.680
them they found the file purporting to have hashed passwords, they checked it
00:48:36.040
against the database and then, while they're still in the process of getting all the information, they take action
00:48:42.599
to help their users protect themselves and to take action they used as many
00:48:49.680
communication channels as they could to reach as many people as possible they didn't just stick a post on the blog, which no one would read, saying
00:48:54.799
hey, you might want to change your password they used every communication means at their disposal to reach as many
00:49:01.319
people as possible so I'm running out of time uh I'd like to conclude with a few thoughts
00:49:07.000
one of them is that I hope that I've convinced you that security is both important and very difficult and
00:49:13.520
that security will only happen for your organization — your code will only get more secure — if you guys
00:49:20.240
all care about making it happen so please do care about this, it's very
00:49:26.160
important and also, pigs are scary if I have time for questions I'm
00:49:33.480
happy to accept them this is right before the lightning talk so you guys are free to bug out if you want
00:49:45.079
to Josh — Josh is going to heckle me: what's the best way to protect yourself against being eaten by a pack of pigs?
00:49:55.119
in the back
00:50:00.400
I can't quite hear — can you stand up and yell at me please? thank
00:50:09.160
you they're all terrible, they're all terrible — we
00:50:14.960
just write a lot of stuff there it's, again, the problem so the
00:50:20.359
question is, do we use any risk management tools internally to help us do all that stuff that I said was
00:50:26.000
really hard and we actually don't, we do all the hard work
00:50:31.160
ourselves yes yeah, I was using the example — I guess LinkedIn, you know, their response
00:50:37.720
to describe — and certainly, you know, that's a bad example — but their response to describe how they're fixing it in the future mhm they said we're solving it
00:50:47.000
yes but they sounded so
00:50:52.359
proud which it wasn't yeah, but more, where do you draw a line
00:50:59.240
between we're going to tell you our implementation, and when that implementation — I mean, something is the
00:51:05.160
standard practice but right, it's a standard so the question is, where do you draw the line between full and honest
00:51:13.200
disclosure and not saying things that will make you look bad is that — well, no, no, how to say things that won't make you —
00:51:18.480
no, right, right — how to not give potential new attackers more information to make you more vulnerable yes, thank
00:51:25.319
you that is an awesome question and the idea is that you should give users as much information as
00:51:33.760
they need to know to take effective action and if they don't need to know
00:51:40.000
it to take effective action or to understand what's happening, then you shouldn't tell them if that makes sense
00:51:46.640
saying something like we're salting our passwords now doesn't really open you up to much risk when you say it in
00:51:53.040
this general tone of look how secure we are, it just makes you kind of look bad any other questions? on the left
00:52:13.359
here so the question is what do I think of Rails needing to get hacked in
00:52:20.359
order for GitHub to take action about — I'm sorry —
00:52:34.200
right so after their organization was hacked, Rails changed some settings so that they prevented that kind of thing
00:52:39.799
from happening in the future I would say two things one is that that's a failure of secure by
00:52:46.799
default two is that, because we're really bad at predicting risk and
00:52:53.280
assessing risk, they didn't realize that they were vulnerable probably because they
00:52:59.480
hadn't conducted a threat assessment there in the front, right for those of us
00:53:07.480
who haven't been coding with risk in mind, what would you say, maybe, like, the top three things that you would focus on when you're
00:53:14.280
starting a new app, besides mass assignment? so for people who
00:53:20.040
are starting coding with risk in mind who haven't done so before, what are some top things to think about,
00:53:26.520
specifically in a Rails app? I'm glad you asked let's talk about web security
00:53:41.640
so there's a guide that you've all read, right? you've all read this guide? yeah, show of hands, who's read the guide
00:53:49.160
better than I expected so this is a great place to start, but I'm still going to talk about all those things
00:53:54.480
because repetition is important because repetition is important so session hijacking, which can be prevented by
00:54:00.799
using HTTPS mass assignment, which can be protected against by using attr_accessible or,
00:54:06.559
in new Rails 4, uh, param— I don't know — strong parameters —
00:54:12.000
thank you CSRF, which can be avoided by not using unsafe GETs that
00:54:17.640
cause change in your system, and by using a security token in things that aren't GETs so that means using
00:54:24.119
protect_from_forgery, including the csrf_meta_tags so your Ajax calls have the header you guys are
00:54:31.200
probably all aware of these, but these are the things that you start thinking about and then from here you
00:54:37.119
progress on to trust relationships between different components in your system
00:54:42.640
especially as many Rails apps are becoming more distributed over time and involving more components so how secure is Redis,
00:54:49.599
for instance? how much trust are you putting in Redis? things like that any others? we're running out of
00:54:57.000
time I assume front left so you talked about not having internal
00:55:02.920
risk management yes how do you
00:55:09.960
feel about hiring someone to come in so the question is, what do I feel about hiring a third
00:55:16.200
party to come in and help you become more secure by doing a number of different things I think that hiring a
00:55:22.480
consultant is a terrible idea because one person won't have the breadth of skills necessary hiring a
00:55:28.760
consultancy that can bring a number of people with different skill sets to bear on your problem is a much better way to
00:55:34.400
go about it I don't think that black-box testing and pen testing are as
00:55:41.640
effective as giving those people the keys to your castle and letting them see inside the
00:55:47.119
box because the best that black-box testing can do is say either we didn't
00:55:53.599
find some things, so we think you're secure or we found some things, so go fix them and then you'll be secure
00:56:00.359
but it won't find everything there's no real incentive for these consultants,
00:56:05.520
after they've found the first three or four things, to keep looking for more so I think that without an actual
00:56:11.200
understanding of how your system works, they just won't be as effective so I would resist the urge to take the
00:56:16.520
cheaper option, which is often pen testing and black-box testing I would have a full code audit done and I would
00:56:23.440
have them use that as the basis of their probing of the system so that's, I'm sure, all the time
00:56:30.960
we have thank you all so much for being awesome and please try to be awesome in a more secure way in the