00:00:03.750
hello Singapore that was good we can do
00:00:10.180
better hi Singapore thank you I was a little
00:00:15.940
bit worried about this timeslot because I know the only thing standing between me and your Friday hugs from Aaron is
00:00:24.220
this like 30 minute talk but then I started thinking about the fact that the only thing standing between you and your
00:00:32.399
beer or preferred evening beverage is Aaron's talk and then I felt better so
00:00:39.000
with that out of the way um hello my name is Vaidehi and just to get a show
00:00:46.420
of hands in the room how many of you grew up either reading or listening to
00:00:53.380
fairy tales okay it's a good number neat
00:00:58.420
how many of you are familiar with the story of Goldilocks and the three
00:01:03.760
bears okay a sizable number for those of you who might not be familiar with the
00:01:10.149
story to summarize basically there's this girl named Goldilocks and she breaks into a
00:01:17.859
house and the house is the home of three bears and she basically like goes
00:01:24.009
through all of their stuff which seems like a terrible thing to teach children because like if you're going to break
00:01:29.109
into anyone's house it should probably be like you know puppies or bunny rabbits I don't think bears are the most
00:01:34.299
friendly animals to do that to but that aside she basically goes through all of
00:01:40.779
their things and she like has the mama bear's porridge and the papa bear's porridge and one is too hot and one is too cold and the entire story is her
00:01:47.740
trying to find the thing that is just right so I got to thinking about what
00:01:54.240
Goldilocks's story might be like if it was reimagined and if we were to do a
00:02:00.310
modern-day retelling of it so surprise it's story time we're going to do a
00:02:07.149
modern-day retelling of Goldilocks and the three bears however in our
00:02:12.640
modern-day retelling Goldilocks isn't you know breaking into large furry
00:02:18.910
animal houses instead she is a Ruby developer because of course she is so
00:02:30.330
Goldilocks is at her very first dev job ever and she is getting to build code
00:02:37.570
and she's getting paid for it and she's like oh my god this is amazing I'm learning so much and she's working at a
00:02:43.150
small consultancy and she basically has at her disposal all of these engineers
00:02:49.240
that she can ask questions of and she's like this is great amazing I'm going to learn a lot and she does learn
00:02:55.360
a lot and she has a lot of fun but after a few months she's definitely doing
00:03:01.750
things and she's contributing to projects but she's not really sure if she's getting any better at programming
00:03:08.830
is she writing better Ruby code she doesn't know and she starts talking to
00:03:14.500
other developers and she realizes that at her company she doesn't really get
00:03:19.510
any written feedback and part of the reason for this is that she doesn't really get real code reviews her team is
00:03:27.220
really small and if she wants code review she has to ask for it and she has to sit down with another engineer and
00:03:32.830
tell them hey can you look at my code and can you tell me what could be better and what I'm doing well and she
00:03:41.530
finds this a little frustrating because she knows that other companies do have a bit more formalized code
00:03:46.870
review processes so after about a year there she decides that this company is
00:03:53.560
not the best fit for her anymore so she goes to another company hoping that she'll find what she's looking for
00:03:59.800
there at her second dev job things are very different
00:04:05.850
everyone is code reviewed and she even gets to code review in fact at her
00:04:11.500
second week on the job she reviews the CEO's code and she's just like what me
00:04:17.680
really I could have technical opinions it turns out she can and she was really
00:04:23.530
excited about that but even at the small startup that she's
00:04:29.200
at there are some limitations so one of the limitations is that there's
00:04:34.690
not very many people and so as long as one person reviews a pull request it
00:04:39.850
gets merged in and this tends to be fine and code gets merged in keeps getting
00:04:51.910
merged in for most pull requests but there are some drawbacks one of the things that
00:04:58.540
she notices is that her team will sometimes create pull requests and they'll devolve into long comment
00:05:06.460
threads and sometimes people will be like we should use tabs no we should use
00:05:11.650
spaces we should use hash rockets why are we talking about this and it'll just
00:05:17.140
keep going and they're not always the most fruitful conversations but things
00:05:22.810
are fine code keeps getting merged in and things are getting shipped and deployed and it seems to be working fine
00:05:29.310
until one day so Goldilocks has been working on this feature for about two
00:05:36.100
weeks and she has poured her heart and soul into it and finally she submits a
00:05:41.560
pull request and two engineers give it a thumbs up and it gets merged in and
00:05:46.720
she's like you know what okay I'm gonna go for coffee I'll see you kids later she comes back 20 minutes later to find
00:05:54.520
that the entire engineering team is freaking out and there's a fire and the
00:06:01.780
fire is that the entire staging environment has just like gone down in flames and they're like Goldilocks what
00:06:08.650
did you do you just merged this giant feature in you did something and she's like well you reviewed my code both of
00:06:16.330
you reviewed my code so I don't know what I did why didn't you see it
00:06:21.450
but not to be deterred she decides that she's going to fight this fire and what
00:06:28.450
she finds out is really interesting turns out in this very large pull request of like 20-some files
00:06:36.920
she had added a controller and turns out somewhere along the line of rebasing she
00:06:45.680
had two controllers and one was empty and one was not and the empty one was overriding the one with all the code and
00:06:51.740
the entire application has come crashing down so by committing an empty file that no
00:06:57.920
one noticed she basically caused kind of a mess so this gets her thinking oh we
00:07:09.920
really thought that we had this code review process down ha no nope nope nope
00:07:17.480
we don't because clearly if we had it down perfectly we wouldn't have had this problem occur I wouldn't be fighting
00:07:24.080
this fire so she starts thinking okay
00:07:31.480
there must be a better way of doing code reviews right right so Goldilocks being
00:07:43.100
the Ruby developer that she is does what probably any of us would do she does
00:07:50.480
some research and she asks the internet and she reads about this ancient text
00:07:58.390
called Code Complete which dates back to
00:08:04.090
the old year of 1993 it's a really great book it's written by Steve McConnell I
00:08:10.700
highly recommend that you read it and in reading Code Complete she starts
00:08:16.220
learning about the theory and the history of code reviews and why we started doing them as teams and as an industry in the
00:08:23.120
first place and in code complete Steve McConnell talks about a lot of different
00:08:28.430
like research that he's done and the theory behind code reviews and one of the things that he says is that code
00:08:34.490
reviews are one part of managing a software engineering process catching problems at the lowest value stage that is at the
00:08:42.890
time at which the least investment has been made and the problems cost the least to correct
00:08:49.910
and basically what he says is that code reviews are meant to act as quality
00:08:55.400
gates just as we test and review the products that we build code reviews are
00:09:01.280
meant to do that for our code bases but as she starts thinking about the teams
00:09:08.030
that she's worked on the two teams she realizes that none of the code review
00:09:13.520
processes that she's been a part of really feel like they're of good quality
00:09:18.640
they kind of feel like formalities so
00:09:25.060
she keeps reading and she learns about something called the idea of collective
00:09:31.940
ownership of construction and this is what code reviews are meant to Center
00:09:37.580
upon this idea that if all of us have a hand and a stake in what we're building
00:09:43.700
together then our code will actually improve as a consequence and Steve
00:09:52.610
McConnell has done a lot of research and he found a lot of studies and he writes about this in code complete and she
00:09:58.580
learns about a few of the different benefits that code reviews actually give
00:10:03.770
us on our engineering teams one of those is that if you have multiple sets of
00:10:09.530
eyes on a code base you actually end up with better quality code seems obvious
00:10:16.760
but for those of us who might not do code reviews regularly it actually is a
00:10:21.950
huge factor another thing is that if someone leaves the team and you have a
00:10:29.330
code review process that's formalized it's a lot easier for others to pick up where they left off if someone leaves
00:10:37.430
the team and everybody has at least looked at the code in review even if they've never worked with it before
00:10:42.590
it'll be a little bit easier for them to dig their teeth into it if they need to add a feature or fix a bug finally
00:10:50.860
another great benefit of having code reviews is that the process of detecting
00:10:56.690
defects the process of catching bugs goes a lot faster if someone has
00:11:01.940
seen the code before or reviewed it if you've never seen a file ever and you're trying to
00:11:08.210
find a bug in it it's a lot tougher so
00:11:13.670
it turns out that in the late 80s a time that I you know can't really imagine
00:11:21.020
anymore but back then code reviews came in three shapes and sizes one was
00:11:28.670
something called inspections and this is actually something that was started by a guy named Michael Fagan at IBM and
00:11:34.600
basically what he did was he started this hour-long code review process where
00:11:40.820
there would be three people a reviewer the author of the code and a moderator and they would basically sit down and
00:11:48.080
they would focus on defect detection they would focus on going through the code and looking for things that could
00:11:53.510
potentially go wrong basically trying to snag bugs the important thing was that they were not correcting as they went
00:12:01.040
along and there was a great talk this morning about not fixing bugs while refactoring to that same point it's
00:12:08.690
really good to not fix bugs while you're reviewing instead keeping the review
00:12:18.170
process as its own thing is really important according to Steve McConnell's research so these really long code
00:12:26.120
review processes called inspections took a little bit of upfront time and
00:12:31.400
effort but it turns out that they were actually really beneficial to the teams that use them the three people the
00:12:38.810
reviewer the author of the code and the moderator would come in with a checklist and they would all know what
00:12:45.470
they were looking for and what their roles were in the code review and when IBM started implementing this and when
00:12:51.920
they shared it with other engineering teams it turns out that using
00:12:57.200
inspections 60% of defects in code bases were caught and there were 20
00:13:05.810
to 30% fewer defects per 1000 lines of code after using inspections on teams
00:13:11.920
the most important part of what these code reviews used to be was that after a code review process
00:13:20.870
everybody on the team would sit down and basically talk about what went well and
00:13:25.920
what didn't and they would take that data and recycle it back and they would iterate so that each time they did an
00:13:33.120
inspection it would get better with time
00:13:38.330
but not everybody has a thousand lines of code that they're adding to a code base sometimes you're adding only a few lines
00:13:45.170
this brings us to our second type of code review this is something called
00:13:50.310
short reviews and the basic idea here was that a lot of teams weren't doing
00:13:55.830
any type of code review process for one-line changes or five line changes
00:14:00.900
and it turns out that based on research
00:14:06.440
those five line changes and one-line changes are actually the ones that are
00:14:11.730
the most painful and the most error-prone teams that introduced short
00:14:17.250
reviews found that they would go from a 55 percent error rate to just a two
00:14:22.710
percent error rate and those that didn't use short reviews and then implemented
00:14:29.040
them found that they had an 86% correctness rate before having short
00:14:34.140
reviews and then afterwards they had a 99.6% correctness rate so that's the
00:14:43.470
second type of code review the third type is one that Goldilocks seems kind of familiar with and this was called
00:14:49.710
something called a walkthrough and this is effectively a type of working meeting where for 30 to 60 minutes you have two
00:14:58.440
engineers usually a more junior engineer and a more senior engineer and they talk about the code that's being submitted
00:15:05.700
for review and technical choices and details that have been made and how to
00:15:11.490
make them better this was not nearly as effective it is helpful but not nearly
00:15:18.839
as impactful as a formalized kind of code review teams that use walkthroughs
00:15:23.850
found they detect approximately 30% of errors in a program
00:15:28.930
which is fine but it's not the most effective and the idea with walkthroughs
00:15:35.470
is that they're a great learning opportunity for junior engineers to ask questions of senior engineers and for
00:15:42.399
them to also give pushback against ideas and it kind of propels a team forward
00:15:51.660
Goldilocks also stumbled upon something called the peer-review sophistication
00:15:57.370
scale and she kind of looked at this and started thinking about whether any of
00:16:04.630
the code reviews that she had been a part of were actually of any good were they even valuable and as she thought
00:16:13.000
about this and looked at the scale she realized that most of the teams she's worked on probably fell at a number two
00:16:20.820
maybe a number three and she got a little bit worried and she wondered am I
00:16:28.240
the only one that feels this way does anybody else feel like this too and this
00:16:34.900
is all theory but like in practice what are other teams doing do any of us have
00:16:40.120
this code review thing down so she decided to debug this whole idea of code
00:16:47.350
reviews in the industry today and because Goldilocks is a modern day Ruby
00:16:53.020
developer she did what any of us would do she asked Twitter I don't know if this was
00:17:01.510
you know the best decision for her but she did it anyways because we know all the things come from Twitter and she
00:17:09.069
basically created a survey and asked all the developers that she knew to spend a
00:17:15.220
couple minutes and answer a couple of questions about what their code review processes were like and whether they
00:17:20.319
were happy with them at all or not a little disclaimer Goldilocks is
00:17:26.170
definitely not a data scientist so she probably could have done a better job of collecting data and analyzing it but
00:17:33.840
it's a start one of the ways that this data is a little bit limited is that
00:17:39.429
it's obviously going to only be people who answer the survey so it's not a great
00:17:45.420
representation of all developers but it's something that we can use at least
00:17:53.930
so the first thing that she learned is that the majority of people answering her survey worked in Java I don't
00:18:04.380
know why they also used Ruby and rails and JavaScript and there were some other
00:18:11.640
languages as well but for the most part that was the majority of people who were answering the survey something to keep
00:18:17.100
in mind and she found that the majority of people who responded did feel like
00:18:23.730
code reviews made them better developers so they seem to be on the same page as her that code reviews were overall a
00:18:30.540
good thing okay cool we all seem to be on the same page what else can we agree
00:18:37.170
upon well it seemed like Swift
00:18:42.780
developers found code reviews to be the most beneficial they came in at about a 9.5 on a scale of one to ten and Ruby
00:18:52.350
developers came in second so not bad 9.2
00:18:57.920
so everybody seemed to think that code reviews were helping their teams what
00:19:03.840
was interesting was that while most people did have all their pull requests reviewed for their teams there was still
00:19:10.200
a whole ten percent of people who had to ask for review ten percent of
00:19:16.470
respondents didn't have any type of formalized code review process they had to ask for it which made her feel a
00:19:24.060
little bit better because she had been in that position before too what was even
00:19:29.460
more interesting was that it didn't really matter what framework you worked with or what language some teams seemed
00:19:38.310
to have a formalized code review process down and others just didn't which led Goldilocks to believe that whether
00:19:45.270
you have a code review process that works has nothing to do with what language you use or what framework and
00:19:50.550
everything to do with the people on your teams another interesting thing
00:19:58.310
that she found was that like her startup most teams only needed one person to
00:20:04.820
review a few people needed more reviewers but generally one reviewer
00:20:10.940
seem to be enough however those reviewers would spend only 5 to 20
00:20:17.420
minutes doing a review of a pull request and there were some people who had to
00:20:24.680
wait up to a day or even a few days to receive review from their teammates so
00:20:35.200
as she read through some of the responses in her survey she saw that a
00:20:42.590
lot of people were doing code reviews but not all of them were happy with how
00:20:47.690
they were being conducted and she kind of boiled it down to two aspects the two
00:20:55.340
things that made or broke a code review were time and substance one person
00:21:06.440
wrote into her survey and they said we have one developer who blindly thumbs up
00:21:12.770
every PR and rarely leaves comments they're attempting to game the rule of
00:21:19.550
at least two approvals it's easy to tell because inside of one minute they've
00:21:26.720
approved three pull requests someone else wrote in and they said if
00:21:34.999
there's a second and third reviewer they're more likely to just
00:21:40.009
rubber-stamp after seeing the first person's review so Goldilocks started
00:21:46.039
wondering are we just going through the actions of reviewing each other's code
00:21:51.289
like are we considering time and substance because it seems like a lot of
00:21:56.869
us are unhappy with it so let's figure out what we mean when we say time and
00:22:02.269
substance the amount of energy and the
00:22:08.899
amount of time that you spend on the code review turns out to be pretty significant in how happy engineers are
00:22:16.720
with a code review process so what we mean by this is who's doing the review
00:22:22.879
and how much time are they spending on it because that seems to really make a difference it's not just the act of
00:22:30.980
doing the code review it's how much weight it carries and who's actually
00:22:37.309
doing the work someone wrote in to her survey and they said everyone on the
00:22:42.860
team should receive equal review I feel like it's so common for senior people to
00:22:49.190
get no feedback because people assume that they can do no wrong they totally
00:22:55.820
can and they might want feedback junior people get nitpicked to death and people
00:23:03.499
seem to forget that self-esteem can be easily affected and that when someone's
00:23:08.749
submitting a code review they're being vulnerable another person wrote in to
00:23:14.899
her survey and said that code reviews need to be acknowledged as a first-class citizen they are a legitimate activity
00:23:23.360
that needs time and focus and as she read through these responses she noticed
00:23:29.720
that there were a couple common themes that kept popping up developers seem to
00:23:35.419
be really frustrated by code reviews that had very large commits or really
00:23:40.909
long PRs and those that provided no context it meant that you would have to spend more
00:23:46.720
energy and time on the code review and that seemed to be a huge barrier it was
00:23:52.000
also very frustrating for developers when different people were spending different amounts of time reviewing
00:23:58.500
particularly when junior developers would get more criticism and longer
00:24:04.240
reviews and senior developers wouldn't and another point of frustration was
00:24:10.390
when technical leads or upper management or CTOs would basically prevent you
00:24:18.240
as a team from treating code reviews as first-class citizens when code reviews
00:24:24.460
were devalued developers felt more frustrated by their team's overall
00:24:29.590
process and productivity so what about substance well as it turns out even if
00:24:39.400
everyone reviews and is reviewed what they're saying and doing while reviewing
00:24:46.060
is pretty important so when we say
00:24:52.060
substance what we're really talking about is how a code review occurs so
00:25:01.750
Goldilocks did like a find in file on all of her responses and she found that 5% of people just automatically had a
00:25:11.020
negative reaction to the phrase code review and they basically used the term nitpick like I hate code
00:25:17.530
reviews they're so nitpicky they're terrible and it was like five percent of her respondents just automatically which
00:25:23.410
really spoke volumes about how people feel about code reviews and what they actually are doing to the team's morale
00:25:32.700
so as it turns out there are a couple of things that seem to be points of
00:25:38.680
frustration for what a code review consists of people really did not like tons and tons
00:25:46.210
of comments on their pull requests they would much rather have a conversation about it
00:25:53.240
they also seem to be really frustrated by having conversations about style and
00:25:59.610
syntax and that seemed to be an indicator of a negative code review process
00:26:06.320
they did however like having conversations about content and semantics
00:26:11.540
those seem to be indicators of the more positive code review process the biggest
00:26:17.790
one however was people who had empathetic code reviews generally seem
00:26:24.510
to be happier with their workflows whereas people who had experiences with
00:26:30.840
very egotistical reviewers or submitters of code reviews were more frustrated by
00:26:38.820
the entire process and thought of it in a very negative light one person wrote
00:26:44.070
into her survey and said that you should review the code not the person let tools
00:26:49.350
handle styling changes for you so that you don't have to spend time with this as a team another person wrote in and
00:26:57.840
said if I ever find myself going back and forth on something via comments I'll
00:27:04.050
just ping the other person and ask them to pair sometimes it's just a lot easier
00:27:10.410
to talk to someone another person wrote in and said that on their team there was
00:27:18.480
trolling of code and they didn't even have any kind of code of conduct so it
00:27:24.180
was really hard to stop this and this was both disheartening but also kind of
00:27:30.390
uplifting for Goldilocks because she realized that she was not the only person that had gone through this she
00:27:37.320
was not the only person who had been disappointed by code reviews in the past and in reading through her survey
00:27:45.480
responses she realized that one of the major factors of what had changed between 25 years ago when Steve
00:27:54.120
McConnell wrote code complete and talked about the theory of code reviews and engineering teams today was that a lot
00:28:01.860
of teams didn't have any kind of checklist of what a code review should or shouldn't be and even worse they had
00:28:09.370
stopped collecting data and iterating on their code review processes people had
00:28:15.160
stopped talking about it and people had stopped thinking about whether it was actually effective they were just going
00:28:20.830
through the motions of doing them so I
00:28:26.220
told you that this was a fairy tale and we've been going along this journey with Goldilocks but I have been not entirely
00:28:35.140
truthful with you because as it turns out this is not a fairy tale I'm
00:28:41.530
Goldilocks surprised I know
00:28:48.090
plot twist I know nobody saw that coming so yeah I don't actually work at
00:28:56.290
either of those two companies I work at a company called Tilde in Portland Oregon and we build something called
00:29:03.640
Skylight which is a Rails profiler if you need something to profile your Rails application you
00:29:09.490
should check us out and to be totally honest with you I don't even know if
00:29:17.680
we're doing code reviews right what I do know is this we have a lot of
00:29:24.460
conversations about them and every engineer on our team feels like they can talk to the rest of the team and bring
00:29:31.870
up issues or points of frustration if they have them one of my co-workers actually she submitted an RFC to totally
00:29:39.100
change our code review process and that RFC was reviewed many times with a lot
00:29:44.500
of comments but we decided to adopt it and that's now the process that we use as a
00:29:50.710
team so I feel like I can bring up any issues to the rest of my colleagues
00:29:56.830
and we can talk about them and iterate and change them so based on all of this research that
00:30:05.590
I've done and all the people that I've talked to and the survey I wanted to share with you a couple of things that
00:30:11.260
you can do small and big to change your code review processes
00:30:16.720
if you're not happy with them because clearly as an industry we all have some work to do on this so a couple of small
00:30:23.950
wins that you could implement when you leave here today commit hooks are really
00:30:30.610
awesome they basically make sure that you can run whatever tasks you need to
00:30:35.920
before you push anything to GitHub they'll make your life a lot easier I recommend trying them out
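A commit hook can even catch the empty-file mistake from Goldilocks's story. Here's a minimal sketch of that hook logic, demonstrated in a throwaway repository rather than installed as a real hook; in a real project it would live at .git/hooks/pre-commit, and all file names below are made up for the demo:

```shell
#!/bin/sh
# Demo of pre-commit hook logic that blocks empty staged files,
# run inside a temporary repository so it is self-contained.

repo=$(mktemp -d)
cd "$repo"
git init -q .
git -c user.name=demo -c user.email=demo@example.com commit -q --allow-empty -m init

printf 'class PostsController; end\n' > posts_controller.rb
: > empty_controller.rb   # accidentally staged with zero bytes
git add posts_controller.rb empty_controller.rb

# The hook logic itself: list newly added staged files that are empty.
empty=$(git diff --cached --name-only --diff-filter=A |
  while read -r f; do
    [ -f "$f" ] && [ ! -s "$f" ] && echo "$f"
  done)

if [ -n "$empty" ]; then
  echo "pre-commit: refusing to commit empty files: $empty"
  # a real hook would `exit 1` here to abort the commit
else
  echo "pre-commit: ok"
fi
```

The same check could just as easily run `rubocop` or your test suite; the point is that the machine flags the mistake before a human reviewer ever sees the pull request.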
00:30:42.630
github also has an amazing feature called templates and we use this at
00:30:49.020
Tilde where basically when somebody opens up a pull request this template
00:30:54.310
shows up and it has like a checklist of things you need to do before you can submit a pull request things you need to do before you can ship and it's really
00:31:01.210
great because people can immediately look at it and know whether something needs more work or something is ready
00:31:07.330
to be reviewed
00:31:13.030
another thing that's super super helpful for your teammates especially if you have a large team or you have people who might be more
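The actual Tilde template isn't shown in the talk, so here's a hypothetical sketch of what a .github/PULL_REQUEST_TEMPLATE.md along those lines might contain; every item is an assumption to adapt to your own team:

```markdown
<!-- .github/PULL_REQUEST_TEMPLATE.md — hypothetical example -->
### What does this PR do?

### Screenshots / GIFs (before and after, if anything visible changed)

### Checklist before requesting review
- [ ] Tests added or updated
- [ ] Linter passes
- [ ] No stray or empty files in the diff
- [ ] Context provided so reviewers can target their efforts
```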
00:31:18.070
comfortable with certain parts of the codebase is giving them context giving
00:31:23.140
your teammates context makes it a lot easier for them to target their efforts and Don gave a great talk today about
00:31:29.080
how human beings have like a certain capacity to focus and hold things in memory this is going to make it a lot
00:31:35.770
easier for your team mates to be able to focus on what changes you made it's
00:31:41.230
really helpful to like add a GIF or a screenshot and show them like what is the thing that happens before the pull
00:31:47.050
request and what can you expect to happen after it'll make the process a lot easier and we'll probably put people
00:31:53.980
in a much better mood linters use them
00:31:59.830
we don't need to talk about like syntax and stuff those are the conversations that we don't need to have because we
00:32:05.440
have machines to automate those things for us so definitely use them and implement them tiny win but really
00:32:12.160
really impactful and lastly encapsulate everything whether that means working on
00:32:20.040
writing concise commits or creating small pull requests that are short and
00:32:26.320
keeping things in a really really concise compact way encapsulate things so that
00:32:32.590
nobody is reading hundreds of lines of code and feeling overwhelmed and just giving it a thumbs up and moving on
00:32:38.460
because that's going to come back to haunt us later some teams have also found
00:32:45.400
that it's really helpful to assign specific reviewers particularly if you find yourself like waiting for someone
00:32:51.100
to review for a long time it's also really great because you can make sure that
00:32:56.110
senior developers are reviewing and being reviewed and that junior developers also get a chance to look at
00:33:01.750
code that a senior developer might be working on and ask questions and learn about things that maybe they might not
00:33:07.690
be familiar with yet there are some big
00:33:12.940
wins that you can do they might look a little daunting but I promise you the payoff is worth it first pushing for
00:33:22.570
a culture that values vulnerability and I think one of the best ways to do this
00:33:27.909
is by making sure that everyone puts themselves out there senior developers
00:33:34.480
can make mistakes I'm going to say it again senior developers do make mistakes and it's important for junior developers
00:33:41.740
to see that and to be able to ask questions when we're working in teams that are vulnerable enough to
00:33:48.789
acknowledge that we all become better developers and create a better product at the end of the day develop empathy
00:33:57.990
this is harder to do but you can do tiny things to make sure that your teammates
00:34:04.720
are becoming more empathetic engineers and people one of the ways that I really
00:34:10.869
like to do this is by calling out the things that I see another developer doing really well so if there's a method
00:34:17.290
that's named really beautifully or some sort of abstraction or something that's
00:34:24.070
been metaprogrammed I really like to call that out and say that you know you did a great job that was awesome
00:34:29.700
that's so important because code reviews don't need to necessarily be something that's negative and you know a symbol of
00:34:36.669
criticism we have this collective ownership in what we're building so when someone
00:34:43.129
does really well we all have a victory in that too and finally iterate if you
00:34:52.580
remember nothing else from today remember this it's so important I'll say it again iterate and what I mean by this is
00:35:00.250
start the conversation if you feel like your code review process isn't where you'd like it to be and if you feel like
00:35:07.609
there's room for improvement or if you're really happy with it share it with other people someone wrote
00:35:15.590
into the survey and I think they highlight this way better than I ever could it's a really eloquent quote
00:35:21.619
and I'm not nearly as eloquent they said I love code reviews in theory in
00:35:28.280
practice they are only as good as the group that's responsible for conducting
00:35:35.240
them in the right manner really it's on
00:35:40.730
all of us at the end of the day to make sure that our quality gates are of good
00:35:46.880
quality so if any of this data was
00:35:52.010
interesting to you I promise I'm not a data scientist but I really tried you
00:35:57.109
can find all of it at this website there were a lot of people who wrote into the survey and
00:36:03.470
they have some really amazing thoughts on what makes a good code review and what doesn't unfortunately it's way more
00:36:09.710
than I could fit into 30 minutes but you can find it all there and you can take
00:36:15.109
the survey yourself and tell me your thoughts on code reviews I would love to hear them and that's all I have for you
00:36:30.070
all right would anyone like to ask Goldilocks a question anyone yeah yeah
00:36:43.310
thank you for the talk I really appreciate the research you put into it it's definitely awesome so my company doesn't have a code review
00:36:50.060
process but we do pair programming like the majority of the time and that seems to
00:36:56.120
address some of the quality issues that you wanted to
00:37:02.450
bring into code reviews and like communication in-person communication so does Goldilocks have an opinion on pair
00:37:09.290
programming she loves it I don't think I can do the rest of this question in the third
00:37:14.810
person sorry I'm a really big fan of pair programming we do pair programming
00:37:21.530
at tilde um so we're always working with pairs and switching them up but I think
00:37:27.140
what's really nice about it well what's really nice about having a code review process even if you do have pair
00:37:33.260
programming is that for example if I'm working with my pair on a feature and someone else is working on
00:37:38.390
another part of this code base it's nice for them to be in tune with what changes are happening in a different part of the
00:37:44.690
code even if you know it's like at the end when they're submitting you know a feature PR or something like that
00:37:50.750
because things get merged into the code base all the time and it can just kind of go as like a stream of things on a
00:37:57.470
Slack channel and you don't know what's happening so I would still really encourage some kind of code review
00:38:03.830
process it doesn't necessarily need to be like an hour-long inspection or something but I think it's really beneficial to see what's going on in the
00:38:10.310
code and sometimes like you might even have some context to provide on what
00:38:15.500
you're working on and be like maybe we should name this method slightly differently my pair and I
00:38:21.020
did something over here you might want to check it out and it might be beneficial to the two of you in
00:38:26.360
what you're doing so I still think that code reviews can be super beneficial but it might look different if you were
00:38:31.940
working in pairs great question okay
00:38:38.210
so I have a two-part question the first
00:38:43.460
one is and it's a direct follow-on to that what does Goldilocks think about bear programming oh my god okay no
00:38:53.420
opinion or maybe you're just not allowed to talk about it so the next one is
00:38:59.720
what does your ideal checklist look like or what's a good checklist to go through because this would be useful for
00:39:04.819
me um so like what do I look for in a code review basically right yeah I feel
00:39:14.720
like I should probably write this down that's a good question for me it's probably can I read the code and understand the problem that's trying
00:39:22.460
to be solved if I looked at it again in
00:39:27.770
six months is it clear to me if
00:39:33.619
someone left the company or if there's a junior programmer looking at this would they be able to understand it are methods named clearly are they
00:39:39.589
actually doing what their names say and are there tests that's a big one cool
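One way a team might turn that checklist into a shared artifact, for example rendered into a pull-request template. The questions come from the talk; the constant and method names are invented for illustration:

```ruby
# Reviewer checklist from the talk, encoded as data so it can be
# printed into a PR description or a review tool.
REVIEW_CHECKLIST = [
  "Can I read the code and understand the problem it's trying to solve?",
  "Would it still be clear to me if I looked at it again in six months?",
  "Could a junior developer, or someone new to the team, follow it?",
  "Are methods named clearly, and do they do what their names say?",
  "Are there tests?"
].freeze

# Renders the checklist as a numbered list for a PR description.
def checklist_markdown
  REVIEW_CHECKLIST.each_with_index.map { |q, i| "#{i + 1}. #{q}" }.join("\n")
end
```
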
00:39:48.460
okay hi thank you for a great talk
00:39:56.329
thank you have you thought about automating the small wins for code review all right
00:40:04.280
can you repeat that have you thought about automating the small wins for code review
00:40:11.300
I haven't I haven't automated all of
00:40:16.609
those things um I know people who have I haven't personally been part of that I
00:40:22.069
did implement RuboCop at my last company but I know other people who have
00:40:27.230
like I think the most automation we have is the PR templates and having linters to enforce style and things
00:40:34.849
like that oh nice because I know one tool called Danger danger.systems that can help automate all of that
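Danger lets a team script exactly these small wins in a Dangerfile; the sketch below simulates the idea in plain Ruby so the logic is visible on its own. The size threshold and directory layout are assumptions for illustration, not Danger defaults or anything from the talk:

```ruby
# Danger-style automated review warnings: flag PRs that are very large
# or that change app code without touching tests.
def review_warnings(changed_files, lines_changed)
  warnings = []
  warnings << "Big PR; consider splitting it into smaller ones." if lines_changed > 500
  touches_app  = changed_files.any? { |f| f.start_with?("app/", "lib/") }
  touches_test = changed_files.any? { |f| f.start_with?("spec/", "test/") }
  if touches_app && !touches_test
    warnings << "App code changed but no tests were added or updated."
  end
  warnings
end
```

In a real Dangerfile the same checks would read the PR via Danger's plugins (e.g. the changed-file list and line counts) and post the warnings as PR comments.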
00:40:52.180
I'd like to share about our company so I think it's better than the average code
00:40:57.339
review but it's not perfect because the checklist that we're looking at
00:41:03.160
is even if it's a bug fix sometimes if there's no test we'll ask
00:41:10.450
where the tests are and yeah like you said screenshots and maybe how to test
00:41:20.520
the feature and all that and I think the code review process also depends
00:41:27.849
on the people and I think in our team we have our egos in check so we
00:41:33.099
don't really get affected by the criticism and the problem that we
00:41:39.040
are encountering is that for bigger features of course many people do review
00:41:46.089
but when it comes to the merging part no one wants to press the merge button no one wants to be responsible
00:41:55.540
for it if something breaks so since you mentioned collective ownership how do you think we can solve that
00:42:03.609
problem so I'll tell you one thing that we do at my company that has really
00:42:10.270
helped mitigate this which is that again we pair so one of the things I love
00:42:15.970
about that is that you really displace this concept of blame that it's not one
00:42:22.270
person who you know wrote buggy code or who you
00:42:28.210
know broke everything or brought down production there are other people who were involved in it too and
00:42:33.819
coding is not something you do in a silo and I think trying to create a team whether that means like having people
00:42:40.990
work on one feature and break it into pieces and take it apart and like you work on this and I'll work on that or
00:42:46.510
pairing on some tougher parts like if there's part of it the system that's more complicated having someone to work
00:42:51.730
with you through that is helpful but on a bigger level it means that it's never just your fault and
00:42:58.780
I think this speaks to something larger in the industry this idea of placing blame and fault and
00:43:04.450
like you broke everything and brought down production that's never going to help the person or the team and
00:43:11.830
I think it ties into this idea that to create a more empathetic team you need to change the language you use in code reviews and change the idea
00:43:18.580
that you know it's all your fault and it's all just on you whereas as you say with
00:43:25.090
collective ownership having everybody have a stake in it and have a hand in it
00:43:30.750
that's a really great question I'd love to talk to you more about that but it's a really hard thing to do because it depends on your team a lot okay so
00:43:39.130
one thing that we do when that happens sometimes is we just ping those people
00:43:45.160
on Slack and ask about it yeah it could be but yeah even if you don't
00:43:50.800
blame people sometimes it's just a personal you know mindset that maybe
00:43:56.020
I'll break this and maybe they'll blame me even if they won't so I think it's a psychological problem as well yeah I
00:44:03.130
think changing those things is harder to do and developing empathy is not an easy thing but you can take tiny steps
00:44:10.090
towards it but it's harder that's why it's a big win not a small win because it takes a longer time
00:44:15.880
to do that okay I think we only have time for one more question so let's play
00:44:22.030
a game of fastest hands up okay all the hands down okay all the hands up first
00:44:27.250
so I know where to look can I get one two three is anyone else or is it
00:44:32.770
just the three of you okay hands down when I say three the hands come up again okay
00:44:38.860
one two three you cheated so things like taking the
00:44:50.620
time to do a thoughtful and empathetic code review and also taking the time to really contextualize the PR for the
00:44:58.300
reviewer um those take time what are
00:45:03.400
some techniques you think that teams can use to maintain momentum when they're pausing to put so much effort
00:45:11.680
into empathy and documentation that's a
00:45:18.220
good question that was a hard one sorry you know it's a good question so I guess
00:45:27.070
your question is like in making time to do code reviews how do you still make sure that you're productive okay to be
00:45:37.960
honest I think a large part of it is making code review like submitting
00:45:44.470
the review and doing the work of reviewing count as part of productivity because if you can do a
00:45:49.690
better code review and avoid like eight hours of dev time debugging something
00:45:55.000
that is productivity it's preemptive right so part of that is like changing maybe upper management or
00:46:01.540
the tech lead's mindset and kind of telling them and making them a stakeholder in why
00:46:07.600
code reviews are important but I think another thing that really helps is breaking it down into small pull
00:46:14.860
requests and commits you make it easier the barrier to entry for reviewing is lower because then
00:46:22.750
you're not like oh I have this hundred line or you know twenty files thing to review I'll just save it for like 4:30
00:46:29.530
p.m. because I don't want to do it cuz I have my own work to do but if you make it a smaller thing that's easier to
00:46:34.780
digest it's something that you know if you ping someone they can find some time the next hour hopefully to do it like
00:46:41.230
making it more bite-sized I think is another good way to make that a bit more
00:46:47.290
approachable and not something that slows you and the team down but I'm sure there are more ways I'd have to think
00:46:53.140
more on how to answer the question so it's very good thank you thank you all
00:46:59.620
right let's give it up for Vaidehi thank you very much