Python Fundamentals Sheet


Pluralsight – Python Fundamentals

Chapter 1 – Introduction to the Python Fundamentals course

Where is Python used? → Web Frameworks, Scientific Computing, Image Processing, Databases,
Build Systems, Documentation, Persistence, Math, Operating Systems, Cryptography, Concurrency, Web
Protocols.

Python is powerful, popular and open-source, which makes it accessible! This course is example-
driven and features 10 modules:
1. Getting Started
2. Strings and Collections
3. Modularity
4. Built-in Types and the Object Model
5. Collection Types
6. Handling Exceptions
7. Comprehensions, Iterables and Generators
8. Defining new types with Classes
9. Files and Resource management
10. Shipping Working and maintainable Code

Python Overview: → Guido van Rossum – Benevolent Dictator for Life; Open-source project; General
Purpose Programming Language; Clear, readable == Expressive and Powerful; Different Python
implementations: CPython – written in C; Jython, IronPython, PyPy (written in RPython); Two major versions: Python 2
and Python 3. The bytecode compilation step is invisible, so Python is called an interpreted language, which isn't
entirely true. Python ships with a large standard library, so it is said that Python is a “Batteries included” language. Python is also
a philosophy about writing code – Pythonic Code; import this – the Zen of Python.
Strongly and Dynamically Typed.

Chapter 2 – Getting Started with Python3

REPL – Read-Eval-Print-Loop >>> ; in the REPL, write _ to reuse the most recent value, e.g. _ * 3


CTRL-Z (then Enter) exits the REPL on Windows; on Linux CTRL-Z only puts the process in the background – write fg to bring
it back (use CTRL-D to exit)

In Python whitespace is significant → 4 spaces in indenting levels. Significant Whitespace:


1. requires readable code
2. no clutter
3. human and computer can't get out of sync
Whitespace rules:
1. Prefer four spaces
2. Never mix spaces and tabs
3. Be consistent on consecutive lines
4. Only deviate to improve readability.
This is programming as Guido “indented” it.
Python Culture and the Zen of Python. PEP – Python Enhancement Proposal; PEP 8, PEP 20 →
import this; Importing a module from the Standard Library → import module_name; use
help(module_name) in the REPL to get info about modules; VERBOSE → using or expressed in more
words than are needed; float division / vs integer division //; the biggest factorial a 32-bit int can hold is 12!
because factorial grows quickly – Python's arbitrary-precision int has no such limit;
Scalar types → int, float, None and bool; Primitive and Collection Types
int – arbitrary-precision integer – signed (conversion from float always rounds towards zero)
float – 64-bit floating point – float("nan"), float("-inf"), float("inf")
NoneType – None → the null object → the absence of a value
bool – True, False – boolean logical values
bool() → type conversion; empty collections are falsey.
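
A minimal REPL-style sketch of these scalar types and bool() conversions; the literal values are just examples:

int("496")           # 496 – construct an int from a string
int(3.9)             # 3 – conversion to int always rounds towards zero
float("1.5")         # 1.5
float("inf")         # inf
bool(0), bool(42)    # (False, True)
bool([]), bool([1])  # (False, True) – empty collections are falsey
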

Zen Moment
Readability counts. Clarity matters, so readability makes for valuable code.

Relational Operators – comparison operators >= <= < > == !=


Conditional Statements → if expr:
expr is converted to bool as if by the bool() constructor
Python provides the elif keyword to eliminate the need for nested if...else structures in many cases →
flat is better than nested <zen>
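
A small sketch of the flat if...elif...else shape this refers to (the variable and thresholds are invented for illustration):

temperature = 25          # hypothetical value
if temperature > 30:
    print("hot")
elif temperature > 15:
    print("warm")
else:
    print("cold")
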

while loops → while expr:


break → the break keyword terminates the innermost loop, transferring execution to the first statement
after the loop.
CTRL-C → KeyboardInterrupt
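
A minimal while-loop sketch using break, assuming a simple counting example:

c = 5
while c != 0:
    print(c)
    c -= 1           # augmented assignment (see the summary below)
    if c == 2:
        break        # terminates the innermost loop immediately
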

Chapter Summary:
– obtaining and installing python3
– read-eval-print-loop or REPL
– simple arithmetic with +, -, *, /, % and //
– Assigning objects to named variables with the = operator
– print()
– Exiting the REPL → CTRL-Z on Windows and CTRL-D on Linux
– Significant Indentation – usually four spaces
– Python Enhancement Proposals
- PEP 8 – Python Style Guide
- PEP 20 – The Zen of Python
– Importing Python Standard Library modules: all three forms
– Finding and browsing help()
– scalar built-in types → int float None bool and conversion between types
– Relational operators for equivalence and ordering
– conditional statements with if... elif... else
– while loops
– interrupting execution with CTRL-C to create a KeyboardInterrupt exception
– Breaking out of loops with break
– Augmented assignment operators for modifying objects in-place
– requesting text from the user with input()
Chapter 3 – Strings and Collections

Python Collections → Str, Bytes, List, Dict

str → immutable sequences of Unicode codepoints


string literals → '' or "" but be consistent

Practicality beats purity → Beautiful text strings rendered in literal form - simple elegance

strings with newlines

1. multiline strings """


2. escape sequences \n → Python 3 has universal newlines, so it isn't like \r\n on Windows

\r\n → carriage return, line feed


r'' → raw strings → what you see is what you get; raw strings suppress the escaping mechanism

strings are sequence types so we can use certain operations on them; there is no separate character type →
characters are simply one-element strings; strings are immutable; help(str); Python strings are Unicode

bytes – immutable sequences of bytes; similar to strings but used for binary data
b'data' → bytes literal
there is a bytes() constructor but its use is more advanced

Zen Moment
Practicality beats purity. Beautiful text strings. Rendered in literal form. Simple elegance.

converting between strings and bytes

str --encode()--> bytes
bytes --decode()--> str

.encode("utf-8") ↔ .decode("utf-8") ; files, network resources, http responses → byte streams

bytes is a sequence of bytes and strings are sequences of Unicode codepoints, and that's why we use
encode and decode
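
A short sketch of the round trip between str and bytes; the sample text is invented, and "utf-8" matches the encoding named above:

norsk = "Følg nøye med"            # str – Unicode codepoints
data = norsk.encode("utf-8")       # bytes – suitable for files and sockets
round_tripped = data.decode("utf-8")
assert round_tripped == norsk
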

list – mutable sequences of objects


list literals [a, b, c] → a[1]

dict – mutable mappings of keys to values {}


data isn't ordered

for-loop – visit each item in a sequence → for each in other languages


for ITEM in SEQUENCE:
----body----
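
Putting the list, dict and for-loop pieces above together in a small sketch (the data is invented):

colours = ["red", "green", "blue"]              # list literal
colours[1] = "yellow"                           # square-bracket assignment
colours.append("purple")                        # grow the list
codes = {"red": "#ff0000", "yellow": "#ffff00"} # dict literal
for name in colours:                            # for ITEM in SEQUENCE:
    print(name, codes.get(name, "unknown"))
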
CTRL-Z → exit the REPL on Windows
CTRL-D → exit the REPL on Linux
Chapter Summary:

– Single and multiline quoting


– Adjacent string literal concatenation
– Universal newlines
– Escape sequences for control characters
– Raw strings suppress the escaping mechanism
– convert other types with str() constructor
– Zero-based square-bracket indexing of strings
– Rich variety of string methods
– Python3 source encoding is UTF-8
– bytes is a sequence of bytes, str is sequence of Unicode codepoints
– bytes literals prefixed with lowercase b
– convert str to bytes with encode(), bytes to str with decode()
– lists are mutable, heterogeneous sequences of objects
– list literals delimited by square-brackets, items separated by commas
– Zero-based, square-bracket indexing to retrieve objects
– square-bracket assignment to replace objects
– grow lists with append()
– construct from other sequences using list() constructor
– dictionaries associate keys with values
– literal dicts delimited by curly brackets
– literal key-value pairs separated by commas, with a colon between key and value
– for loops take items one-by-one from an iterable object, binding a name to the current item
– correspond to for-each loops in other languages
– for ITEM in SEQUENCE:

Chapter 4 Modularity

def – used to define functions; return keyword; a function without a return statement returns None, but the
REPL doesn't display None, so we don't see anything; REUSABLE functions;
distinguishing between module import and module execution***
special attributes in Python are delimited by double underscores; __name__ evaluates to “__main__” or
the actual module name depending on how the enclosing module is being used; module code is only
executed once, on first import**; Python execution model → when are functions defined, what
happens when a Python module is imported;
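
A sketch of the import-vs-execution pattern described here, using an invented module name words.py:

# words.py – hypothetical module
def fetch_words():
    return ["example", "words"]

def main():
    print(fetch_words())

if __name__ == "__main__":
    main()      # runs only when executed directly, not when imported
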

python module → convenient import with API


python script → convenient execution from command line
python program → perhaps composed of many modules

never import * → it can lead to namespace clashes

advanced command line argument parsing: python standard library → argparse or many third party
options such as → docopt
documenting your code using docstrings
"""docstrings go directly below the def line, as the first statement of the function, delimited by triple quotes"""
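
A minimal docstring sketch; the function is hypothetical and the wording loosely follows the conventions mentioned below:

def square(x):
    """Return x multiplied by itself.

    Args:
        x: A number to square.
    """
    return x * x

help(square)     # the docstring is what help() displays
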
Zen Moment
Sparse is better than dense. Two between the functions, that is the number of lines, PEP8 recommends.

PEP 257 → docstring conventions → not widely used


reStructuredText/Sphinx → making html documents out of docstrings
We can access docstrings with help() in the REPL: help(your_function)
documenting code with comments
shebang on top of the code #!/usr/bin/env python3
pylauncher**
#!/usr/bin/env python3 → allows the program loader to identify which interpreter should be used

Chapter Summary:

– Python code is placed in *.py files called “modules”


– modules can be executed directly with python3 module_name.py
– Brought into the REPL or other modules with import module_name
– Named functions defined with the def keyword: def function_name(arg1, argn)
– Return from functions using return keyword with optional parameter
– Omitted return parameter or implicit return at end returns None
– Use __name__ to determine how the module is being used
– If __name__ is __main__ the module is being executed
– Module code is executed exactly once, on first import
– def is a statement which binds a function definition to a name
– Command line arguments are accessible through sys.argv
– The script filename is in sys.argv[0]
– Docstrings are standalone literal strings as the first statement of a function or module
– Docstrings are delimited by triple quotes
– Docstrings provide help()
– Module docstrings should precede other statements
– Comments begin with # and run to the end of the line
– A special comment on the first line beginning #! controls module execution by the program
loader

Chapter 5 Objects

variables are references to objects; integer objects are immutable; y = x; no copy of data, just a copy of the
reference; id() → returns a unique identifier for an object; garbage collection; id() deals with the object,
not the reference; Variables are named references to objects***; value equality vs identity; value –
equivalent “contents”; identity – same object; value comparison can be controlled programmatically;
pass by object reference (arguments) → the value of the reference is copied, not the value of the object.
Default arguments; def function(a, b=value) → keyword and positional arguments; default argument
values are evaluated when def is evaluated. They can be modified like any other object; Python –
dynamic and strong typing; JavaScript – dynamic and weak typing;
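
A small sketch of the default-argument gotcha mentioned above (mutable defaults are evaluated once, when def is executed); the function names are invented:

def append_bad(item, seq=[]):       # one shared list for every call!
    seq.append(item)
    return seq

def append_good(item, seq=None):    # the usual idiom
    if seq is None:
        seq = []
    seq.append(item)
    return seq

append_bad(1); append_bad(2)        # second call returns [1, 2] – state leaks between calls
append_good(1); append_good(2)      # each call gets a fresh list
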
Type systems:
dynamic type system – object types are only resolved at runtime
strong type system – there is no implicit type conversion

Zen Moment
Special cases aren't special enough to break the rules. We follow patterns, not to kill complexity, but to
master it.

variable scoping → object references have no type; scopes are contexts in which named references can
be looked up; local → inside a current function; enclosing → any and all enclosing functions;
global → top level of module
built-in → provided by the builtins module
LEGB rule → names are looked up starting from the narrowest scope and working outwards; global → rebinds a global name at module scope
In Python, everything is an OBJECT.; type(); dir(object) → used to see attributes of an object
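
A short sketch of the global keyword rebinding a module-scope name (the names are invented):

count = 0                  # global (module) scope

def set_count(c):
    global count           # without this, count = c would create a new local name
    count = c

set_count(5)
print(count)               # 5
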

Chapter Summary:

– Think of named references to objects rather than variables


– assignment attaches a name to an object
– assigning from one reference to another puts two name tags on the same object
– The garbage collector reclaims unreachable objects
– id() returns a unique and constant identifier
– The is operator determines equality of identity
– Test for equivalence using ==
– Function arguments are passed by object-reference
– functions can modify mutable arguments
– Reference is lost if a formal function argument is rebound
– To change a mutable argument, replace its contents
– Function arguments can be specified with defaults
– Default argument expressions evaluated once, when def is executed
– Python uses dynamic typing
– We don't specify types in advance
– Python uses strong typing
– Types are not coerced to match
– Names are looked up in four nested scopes
– LEGB rule → Local, Enclosing, Global and Built-Ins
– Global References can be read from a local scope
– Use global to assign to global references from a local scope
– Everything in Python is object
– This includes modules and functions
– They can be treated just like other objects
– import and def result in binding to named references
– type can be used to determine the type of an object
– dir() can be used to introspect an object and get its attributes
– the name of a function or module object can be accessed through its __name__ attribute
– The docstring for a function or module can be accessed through its __doc__ attribute
– Use len() to measure the length of a string
– You can multiply a string by an integer
– Produces a new string with multiple copies of the operand
– This is called the repetition operation

Chapter 6 Collections

str list dict tuple range set

tuple – heterogeneous immutable sequence ( ) ; tuples can contain any type of object, nested tuples; k =
(391, ) → single element tuple; tuple unpacking → useful
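
A brief sketch of tuple literals, unpacking and the idiomatic swap mentioned in the summary below:

p = (4, 5)             # tuple literal
k = (391,)             # single-element tuple needs the trailing comma
x, y = p               # tuple unpacking
x, y = y, x            # idiomatic swap

def minmax(items):
    return min(items), max(items)    # returning multiple values as a tuple

lo, hi = minmax([83, 33, 84, 32])
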
strings – homogeneous immutable sequence of Unicode codepoints (characters); len(), join(), split(),
partition(), format(), etc.
range → arithmetic progression of integers; prefer enumerate() for counters
list → heterogeneous mutable sequence; slicing, full slice [:] → used for copying a list; index(item), del
seq[index], insert, reverse() reversed() sort() sorted()
shallow copies*** → list repetition is shallow
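
A compact sketch of full-slice copying and why such copies are shallow (example data only):

s = [[1, 2], [3, 4]]
t = s[:]                   # full slice – a new list object...
assert t is not s
assert t[0] is s[0]        # ...that still refers to the same inner lists (shallow copy)

u = [[0]] * 3              # list repetition is shallow too
u[0].append(1)
print(u)                   # [[0, 1], [0, 1], [0, 1]]
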
dictionary → unordered mapping from unique immutable keys to mutable values; dict() constructor
accepts: iterable series of key-value 2-tuples ; d.copy() → for copying dictionaries; dict(d) → pass
existing dictionary to dict(); extend a dictionary with update(); .values .keys
pprint module → pretty printing
set → unordered collection of unique immutable objects; {} to create empty set use set() constructor;
membership → in and not in; set algebra operations(unions, etc); s.union/intersection/difference etc.
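
A tiny sketch of the set algebra operations listed above (invented values):

blue_eyes = {"Olivia", "Harry", "Lily"}
blond_hair = {"Harry", "Jack", "Lily"}
blue_eyes & blond_hair        # intersection: {"Harry", "Lily"}
blue_eyes | blond_hair        # union
blue_eyes - blond_hair        # difference: {"Olivia"}
"Olivia" in blue_eyes         # membership: True
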
Collection Protocols – protocol-implementing collections
protocols → to implement a protocol, objects must support certain operations; most collections
implement the container, sized and iterable protocols
container → in and not in
sized → len()
iterable → can produce an iterator with iter(s)
sequence → retrieve elements by index

Zen Moment
The way may not be obvious at first. To concatenate, invoke join on empty text. Something for nothing.

Chapter Summary:

– tuples are immutable sequence types


– literal syntax: optional parentheses around comma-separated items
– single element tuples must use a trailing comma
– tuple unpacking – return values and idiomatic swap
– strings are immutable sequence types of Unicode codepoints
– string concatenation is most efficiently performed with join() on empty separator
– the partition() method is a useful and elegant string parsing tool
– the format() method provides a powerful way of replacing placeholders with values
– ranges represent integer sequences with regular intervals
– ranges are arithmetic progressions
– the enumerate() function is often a superior alternative to range()
– Lists are heterogeneous mutable sequence types
– negative indexes work backwards from the end
– slicing allows us to copy all or part of a list
– the full slice is a common idiom for copying lists, although the copy() method and list()
constructor are less obscure
– list and other collection copies are shallow
– list repetition is shallow
– dictionaries map immutable keys to mutable values
– iteration and membership testing is done with respect to the keys.
– Order is arbitrary
– the keys(), values(), and items() methods provide views onto different aspects of a dictionary,
allowing convenient iteration
– sets store an unordered collection of unique elements
– sets support powerful and expressive set algebra operations and predicates
– protocols such as iterable, sequence and container characterise the collections

Chapter 7 Handling Exceptions

exception handling is a mechanism for stopping “normal” program flow and continuing at some
surrounding context or code block
raise an exception to interrupt program flow → handle an exception to resume control → unhandled
exceptions will terminate the program → exception objects contain information about the exceptional
event; exceptions and control flow; different programming errors**;
IMPRUDENT → not showing care for consequences of an action; rash;
Callers need to know what exceptions to expect and when; use exceptions that users will anticipate;
standard exceptions are always the best choice; exceptions are part of families of related functions that
are referred to as “protocols”; use common or existing exception types when possible; avoid protecting
against TypeError; Easier to Ask Forgiveness than Permission (EAFP) is favored over Look Before You Leap (LBYL)
resource cleanup with finally → try...finally lets you clean up whether an exception occurs or not
Errors are like bells, and if we make them silent they are of no use
platform-specific modules → Windows – msvcrt; Linux/OS X – sys, tty, termios
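
A sketch of the try/except/raise/finally shape this chapter describes, following the EAFP style (the function and messages are invented):

import sys

def convert(text):
    try:
        return int(text)                           # just try it (EAFP)
    except (ValueError, TypeError) as e:
        print(f"Conversion error: {e!s}", file=sys.stderr)
        raise                                      # re-raise the current exception
    finally:
        print("conversion attempted")              # runs whether an exception occurred or not
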

Chapter Summary:

– raising an exception interrupts normal program flow and transfers control to an exception
handler
– exception handlers defined using the try...except construct
– try blocks define a context for detecting exceptions
– corresponding except blocks handle specific exception types
– python uses exceptions pervasively
– many built-in language features depend on them
– except blocks can capture exceptions, which are often of a standard type
– programmer errors should not normally be handled
– exceptional conditions can be signaled using raise
– raise without an argument re-raises the current exception
– generally do not check for TypeErrors
– Exception objects can be converted to strings using str()
– A function's exceptions form part of its API
– they should be documented properly
– prefer to use built-in exception types when possible
– use the try...finally construct to perform cleanup actions
– may be used in conjunction with except blocks
– Output of print() can be redirected using the optional file argument
– use and and or for combining boolean expressions
– return codes are too easily ignored
– platform-specific actions can be implemented using EAFP along with catching ImportErrors

Chapter 8 Iterables

Iterable Objects**; comprehensions → concise syntax for describing lists, sets or dictionaries in
declarative or functional style; list comprehensions [len(word) for word in words]; set comprehensions
{expr(item) for item in iterable}; dictionary comprehensions{key_expr:value_expr for item in iterable}
don't cram too much complexity into comprehensions; [x for x in range(101) if is_prime(x)] → filtering
predicates?* ; optional filtering clause [expr(item) for item in iterable if predicate(item)]; code is
written once but read over and over. Fewer is clearer, but don't overuse comprehensions; iteration
protocols → iterable protocol; iterator protocol;
Iterable protocol → Iterable objects can be passed to the built-in iter() function to get an iterator;
iterator = iter(iterable)
Iterator protocol → iterator objects can be passed to the built-in next() function to fetch the next item
item = next(iterator)
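
A minimal sketch of the two protocols driving these calls (the list is example data):

iterable = ["Spring", "Summer", "Autumn", "Winter"]
iterator = iter(iterable)         # iterable protocol
print(next(iterator))             # "Spring" – iterator protocol
print(next(iterator))             # "Summer"
# when the series is exhausted, next() raises StopIteration
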
generator functions → those that have at least one yield → one of the most powerful features of the language
Generator functions in Python → specify iterable sequences; all generators are iterators
are lazily evaluated → the next value in the sequence is computed on demand; are composable into
pipelines → for natural stream processing; yield keyword; next(s) → next value;
Stateful Generators
– generators resume execution
– can maintain state in local variables
– complex control flow
– lazy evaluation
Laziness and the Infinite
– Just in Time computation
– Infinite or large sequences → sensor readings, mathematical series, massive files
Generator comprehensions
– similar syntax to list comprehensions
– create a generator object
– concise
– lazy evaluation
(expr(item) for item in iterable)
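
A short sketch of a lazy generator function and an equivalent generator expression (the series chosen is arbitrary):

def lucas():
    yield 2                        # generators contain at least one yield
    a, b = 2, 1
    while True:                    # lazily produces an infinite series
        yield b
        a, b = b, a + b

g = lucas()
next(g), next(g), next(g)          # 2, 1, 3 – computed on demand

squares = (x * x for x in range(10))     # generator expression
sum(squares)                             # 285
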
Batteries included → Iteration Tools → itertools
Chapter Summary:

– Comprehensions
– Comprehensions are a concise syntax for describing lists, sets, and dictionaries
– Comprehensions operate on an iterable source object and apply an optional predicate filter and
a mandatory expression, both of which are usually in terms of the current item
– iterables are objects over which we can iterate item by item
– we retrieve an iterator from an iterable object using the built-in iter() function
– Iterators produce items one-by-one from the underlying iterable series each time they are passed
to the built-in next() function
– Generators
– Generator functions allow us to describe series using imperative code
– Generator functions contain at least one use of the yield keyword
– Generators are iterators. When advanced with next() the generator starts or resumes execution
up to and including the next yield
– each call to a generator function creates a new generator object
– Generators can maintain explicit state in local variables between iterations
– Generators are lazy, and so can model infinite series of data
– Generator expressions have a similar syntactic form to list comprehensions and allow for a
more declarative and concise way of creating generator objects
– Iteration tools – Built-ins such as sum() any() zip() all() min() max() enumerate()
– Standard library itertools module – chain() islice() count() and many more!

Chapter 9 Classes

You can get a long way with python's built-in types; but when they're not right for the job, you can use
classes to create custom types; Classes define the structure and behaviour of objects; An object's class
controls its initialization; Classes make complex problems tractable. Classes can make simple solutions
overly complex; Python lets you find the right balance; Class names use CamelCase; Method → a function
defined within a class; Instance method → a function which can be called on objects; self → the first
argument to all instance methods; f = Flight(); f.number() is equivalent to Flight.number(f);
Initializers; __init__() → instance method for initializing new objects → is an initializer, not a
constructor; self is similar to “this” in Java or C++; implementation details start with _
public, private, protected*** → we're all adults here...
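
A minimal class sketch tying these points together; Flight is the name used in the notes above, but the body here is an invented simplification:

class Flight:
    """A flight with a particular flight number."""

    def __init__(self, number):
        if not number:
            raise ValueError("flight number required")   # establish class invariants in the initializer
        self._number = number                            # leading underscore: implementation detail

    def number(self):                                    # instance method; self is the instance
        return self._number

f = Flight("SN060")
f.number() == Flight.number(f)                           # True – two spellings of the same call
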
Class invariants** → truths about an object that endure for its lifetime
Law of Demeter → Never call methods on objects you get from other calls. Only talk to your friends.

Zen Moment
Complex is better than complicated. Many moving parts combined in a clever box are now one good
tool.
Don't feel compelled to create classes without good reason. Tell objects what to do, don't ask for their
state and base your actions on that; \ → used for breaking long lines of code;
Polymorphism → using objects of different types through a common interface
DuckTyping → “When I see a bird that walks like a duck and swims like a duck and quacks like a duck, I
call that bird a duck.” – James Whitcomb Riley
Inheritance – A sub-class can derive from a base-class, inheriting its behaviour and making behaviour
specific to the Sub-class. → python uses late binding.
In Python, inheritance is most useful for sharing implementation.
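
A small duck-typing sketch: the classes share no base class, yet both work through the same interface (all names invented):

class Duck:
    def speak(self):
        return "quack"

class Person:
    def speak(self):
        return "hello"

def converse(speaker):
    print(speaker.speak())     # works for any object that has a speak() method

converse(Duck())
converse(Person())
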

Chapter Summary:

– All types in Python have a 'class'


– Classes define the structure and behaviour of an object
– Class is determined when object is created
– normally fixed for the lifetime
– Classes are key support for Object-Oriented Programming in Python
– Classes defined using the class keyword followed by CamelCase name
– Class instances created by calling the class as if it were a function
– Instance methods are functions defined inside the class
– should accept an object instance called self as the first parameter
– Methods are called using instance.method()
– syntactic sugar for passing self instance to method
– the optional __init__() method initializes new instances
– if present, the constructor calls __init__()
– __init__() is not a constructor
– Arguments passed to the constructor are forwarded to the initializer
– Instance attributes are created simply by assigning to them
– Implementation details are denoted by a leading underscore
– There are no public, protected or private access modifiers in Python
– Accessing implementation details can be very useful
– especially during development and debugging
– class invariants should be established in the initializer
– if the invariants can't be established raise exceptions to signal failure
– methods can have docstrings, just like regular functions
– classes can have docstrings
– even within an object method calls must be preceded with self
– You can have as many classes and functions in a module as you wish
– related classes and global functions are usually grouped together this way
– polymorphism in python is achieved through duck typing
– polymorphism in python does not use shared base classes or interfaces
– class inheritance is primarily useful for sharing implementation
– All methods are inherited, including special methods like the initializers
– Strings support slicing, because they implement the sequence protocol
– Following the Law of Demeter can reduce coupling
– We can nest comprehensions
– It can sometimes be useful to discard the current item in a comprehension
– When dealing with one-based collections it's often easier just to waste one list entry
– Don't feel compelled to use classes when a simple function will suffice
– Comprehensions or generator expression can be split over multiple lines
– Statements can be split over multiple lines using backslash
– use this feature sparingly and only when it improves readability
– Use “Tell! Don't ask” to avoid tight coupling between objects
Chapter 10 Files and Resource Management

open() → open a file, mode and encoding; text file access → is encoded; sys.getdefaultencoding()
open() → modes**; write() → returns the number of codepoints not the number of bytes
read(bytes); seek(offset); readline(), writelines(); files as iterators; Typical file usage:
f = open()
# work work work
f.close() → is required to actually write the data
with-block → resource cleanup with context-managers
open() - returns a context-manager

with EXPR as VAR:
    BLOCK
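
A sketch of typical text-file usage with a context manager; the filename and encoding are placeholders:

with open("example.txt", mode="rt", encoding="utf-8") as f:
    for line in f:                  # files are iterators, yielding line by line
        print(line.rstrip())
# no explicit f.close() needed – the with-block closes the file for us
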

binary files → device independent bitmaps


bitwise operators; fractal images; reading binary files; file-like objects – loosely defined set of
behaviours for things that act like files

Chapter Summary:

– Files are opened using the built-in open() function, which accepts a file mode to control
read/write/append behaviour and whether the file is to be treated as raw binary or encoded text
data
– For text data you should specify a text encoding
– text files deal with string objects and perform universal newline translation and string encoding
– binary files deal with bytes objects with no newline translation or encoding
– when writing files, it's our responsibility to provide newline characters for line breaks
– files should always be closed after use
– files provide various line-oriented methods for reading and are also iterators which yield line by
line
– files are context managers and the with-statement can be used with context managers to ensure
that cleanup operations, such as closing files, are performed
– The notion of file-like objects is loosely defined, but very useful in practice
– Exercise EAFP to make the most of them
– Context managers aren't restricted to file-like-objects. We can use tools in the contextlib
standard library module, such as closing() wrapper to create our own context managers
– help() can be used on instance objects, not just types
– Python supports bitwise operators &, | and left and right-shifts.

Chapter 11 Shipping working and maintainable code

unittest-> unit tests, integration tests, acceptance tests


TestCase → groups together related test functions → basic unit of organization in unittest
fixtures → code run before and/or after each test function
assertions → specific tests for conditions and behaviours
TDD → test driven development
setUp, tearDown → methods
PDB → The Python DeBugger
virtual environment → light-weight self-contained python installation
pyvenv, virtualenv; packaging code**; installing third-party modules → distutils, easy_install, pip
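
A minimal unittest sketch of a TestCase with a fixture and assertions; the code under test is invented:

import unittest

class TextAnalysisTests(unittest.TestCase):
    def setUp(self):
        self.text = "alpha beta gamma"       # fixture: runs before each test method

    def test_word_count(self):
        self.assertEqual(len(self.text.split()), 3)

    def test_raises_on_bad_index(self):
        with self.assertRaises(IndexError):
            self.text.split()[10]

if __name__ == "__main__":
    unittest.main()
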

Zen Moment
In the face of ambiguity, refuse the temptation to guess. To guess is to know that you have left
something out. What are you missing?

Chapter Summary :

– unittest is a framework for developing reliable automated tests


– You define test cases by subclassing from unittest.TestCase
– unittest.main() is useful for running all of the tests in a module
– setUp() and tearDown() run code before and after each test method
– Test methods are defined by creating method names that start with test_
– TestCase.assertRaises() in a with-statement to check that the right exceptions are thrown in a
test
– Python's Standard debugger is called PDB
– PDB is a standard command-line debugger
– pdb.set_trace() can be used to stop program execution and enter the debugger
– Your REPL's prompt will change to (Pdb) when you're in the debugger
– You can access PDB's built-in help system by typing help
– use “python -m pdb <script name>” to run a program under pdb from the start
– PDB's where command shows the current call stack
– PDB's next command lets execution continue to the next line of code
– PDB's continue command lets program execution continue indefinitely until you stop it with
control-c
– PDB's list command shows you the source code at your current location
– PDB's return command resumes execution until the end of the current function
– PDB's print command lets you see the values of objects in the debugger
– Use quit to exit PDB
– Virtual environments are light-weight, self-contained Python installations that any user can
create
– pyvenv is the standard tool for creating virtual environments
– pyvenv accepts both a source-installation argument as well as a directory name into which it
creates the new environment
– to use a virtual environment, you need to run its activate script
– when you activate a virtual environment, your prompt is modified to remind you
– the distutils package is used to help you distribute your python code
– distutils is generally used inside a setup.py script which users run to install your software
– the main function in distutils is setup()
– setup() takes a number of arguments describing both the source file as well as metadata for the
code
– the most common way to use setup.py is to install code using python setup.py install
– setup.py can also be used to generate distributions of your code
– Distributions can be zip files, tarballs, or several other formats
– pass --help to setup.py to see all of its options
– Three common tools for installing third-party software are distutils, easy_install and pip
– The central repository for Python packages is the Python Package Index, also called PyPI or
“cheeseshop”
– You can install easy_install by downloading and running distribute_setup.py
– You use easy_install to install modules by running easy_install package-name from the
command line
– You can install pip via easy_install
– To install modules with pip, use the subcommand notation pip install package-name
– divmod() calculates the quotient and remainder for a division operation at one time
– reversed() function can reverse a sequence
– You can pass -m to your Python command to have it run a module as a script
– Debugging makes it clear that python is evaluating everything at runtime
– You can use a module's __file__ attribute to find out where its source code is located
– Third-party Python code is generally installed into your installation's site-packages directory
– nose is a useful tool for working with unittest-based tests

Pluralsight – Python Beyond the Basics

Chapter 1 Organizing Larger Programs

packages → a module which can contain other modules → adding structure to a program; module →
a single source code file; packages are generally directories; urllib.__path__ ; sys.path → list of
directories Python searches for modules; sys.path.append(' '); PYTHONPATH – environment variable
listing paths added to sys.path; export PYTHONPATH=' '; reader.__file__ ; subpackaging →
python3 -m reader.compressed.gzipped test.gz → test.gz is data compressed with gzip

Package review
1. Packages are modules that contain other modules
2. Packages are generally implemented as directories containing a special __init__.py file
3. The __init__.py file is executed when the package is imported
4. Packages can contain subpackages which themselves are implemented with __init__.py files in
directories; absolute imports; relative imports → imports which use a relative path to modules
in the same package
. → same directory ; .. → parent directory ; Relative imports :
1. Can reduce typing in deeply nested package structures
2. Promote certain forms of modifiability
3. Can aid package renaming and refactoring
4. General advice is to avoid them in most cases
__all__ - list of attribute names imported via from module import *
locals() → shows names bound in the current scope, e.g. which modules are imported in the interpreter; from module import * → The __all__
attribute should be a list of strings containing names available in the module; → protecting imports
– namespace packages → packages split across several directories → useful for splitting large
packages into multiple parts (PEP 420); namespace packages have no __init__.py → this avoids
complex initialization ordering problems; How does Python find namespace packages?:
1. Python scans all entries in sys.path
2. If a matching directory with __init__.py is found a normal package is loaded
3. Otherwise, all matching directories in sys.path are considered part of the namespace package

executable directories → directories containing an entry point for Python execution; executable
zip files → zip files containing an entry point for Python execution; singleton pattern; modules
as singleton; recommended project structure :

Chapter Summary:

Chapter 2 Beyond Basic Functions

Function review; functions → module global scope; methods → functions defined within classes; default argument
values; function objects are callable objects; Callable instances: __call__() → e.g. callables that save
state in between calls; the timeit module; Classes are callable → calling the class invokes the
constructor; Conditional expressions → PEP 308; Lambdas → Alonzo Church → a lambda is an expression
which results in a callable object;
def :

1. Statement which defines a function and binds it to a name


2. Must have a name
3. Arguments delimited by parentheses
4. Zero or more arguments supported; 0 args = ( )
5. Body is an indented block of statements
6. A return statement is required to return anything other than None
7. Regular functions can have docstrings
8. Easy to access for testing

Lambdas :

1. Expression which evaluates to a function


2. Anonymous
3. Argument list terminated by a colon, arguments separated by commas
4. Zero or more arguments supported → zero args = lambda :
5. body is a single expression
6. The return value is given by the body expression; no return statement is permitted
7. Lambdas cannot have docstrings
8. Awkward or impossible to test
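
A brief sketch contrasting a def with an equivalent lambda, here used as a sort key (the data is invented):

names = ["Guido van Rossum", "Tim Peters", "Raymond Hettinger"]

def last_name(name):
    return name.split()[-1]

sorted(names, key=last_name)                        # named function
sorted(names, key=lambda name: name.split()[-1])    # anonymous, single-expression body
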

Detecting callable objects with the built-in callable(); extended formal argument syntax → formal arguments are
arguments at the function definition site: *args for positional arguments, **kwargs for keyword arguments;
Extended call syntax → extended actual argument syntax; actual arguments → arguments at the
function call site; Forwarding arguments; pprint module; Transposition***;
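
A compact sketch of extended argument syntax and argument forwarding (the function names are invented):

def hypervolume(*lengths):                 # *args: extra positional arguments as a tuple
    v = 1
    for length in lengths:
        v *= length
    return v

def tag(name, **attributes):               # **kwargs: extra keyword arguments as a dict
    return "<{} {}>".format(name, attributes)

def trace(f, *args, **kwargs):             # forwarding arguments unchanged
    print("calling", f.__name__)
    return f(*args, **kwargs)

trace(hypervolume, 2, 3, 4)                # 24
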
Chapter Summary:
Chapter 3 Closures and Decorators

Local Functions → define functions inside functions; LEGB rule; useful for: specialized one-off
functions, aid in code organization and readability; similar to lambdas but more general; may contain
multiple expressions, may contain statements; returning functions from functions → first class
functions; functions can be treated like any other object; clojures and nested scopes → closures –
maintain references to objects from earlier scopes; function factories → functions that returns new
specialized functions → combination of runtime function definition and clojures makes this possible
The nonlocal keyword → LEGB does not apply when making new bindings; global – introduce names
from global namespace into local namespace; you get a syntax error if name doesn't exists; function
decorators → missing; A first decorator example @escape_unicode; what can be a decorator? → we've
seen functions as decorators, but other objects can be decorators as well; class decorators; instance as
decorators → decorating with instance calls the instance; multiple decorators; @ => on separate line
above the function → processed in reverse order; Decorating methods; functools.wraps() → a naive decorator
can lose important metadata; properly update metadata on wrapped functions → the functools library;
decorators are a powerful tool, widely used in Python; don't overuse decorators!;
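
A minimal decorator sketch using functools.wraps to preserve the wrapped function's metadata (the decorator itself is a made-up logging example):

import functools

def noisy(f):
    @functools.wraps(f)                     # keeps f.__name__ and f.__doc__ intact
    def wrapper(*args, **kwargs):
        print("calling", f.__name__)
        return f(*args, **kwargs)
    return wrapper

@noisy
def add(a, b):
    """Add two numbers."""
    return a + b

add(1, 2)            # prints "calling add", returns 3
add.__name__         # "add", not "wrapper"
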

Chapter Summary:
Chapter 4 Properties and class methods

Class Attributes → class attributes vs instance attributes; assignment to attributes → self.attr =


something → always creates an instance attribute, never a class attribute; static methods → with the
@staticmethod decorator; static is a relic of terminology from C and C++; Class methods → with the
@classmethod decorator; Choosing @staticmethod – no access needed to either class or instance
objects; most likely an implementation detail of the class; may be able to be moved to become a
module-scope function; or choosing @classmethod → requires access to the class object to call other
class methods or the constructor; class methods for named constructors; static methods with inheritance;
class methods with inheritance; properties → encapsulation using the @property decorator;
python != java → using getters and setters → deeply unpythonic; property can be used as an attribute
properties and inheritance → inheritance interaction with the @property decorator; chained relational
operators; “ All problems in Computer Science can be solved by another layer of INDIRECTION” →
David Wheeler;
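
A short sketch of @classmethod as a named constructor and @property for encapsulation; the class and units are invented:

class Temperature:
    def __init__(self, celsius):
        self._celsius = celsius

    @classmethod
    def from_fahrenheit(cls, f):           # named constructor: needs the class object
        return cls((f - 32) * 5 / 9)

    @property
    def celsius(self):                     # read like an attribute: t.celsius, no parentheses
        return self._celsius

    @staticmethod
    def _valid(value):                     # needs neither the class nor an instance
        return value >= -273.15

t = Temperature.from_fahrenheit(212)
t.celsius                                  # 100.0
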
Chapter Summary:

Chapter 5 String Representations

str(), repr() → two string representation functions for making string representations of Python
objects: __str__() and __repr__(); repr() → BIF that produces an unambiguous string representation of an
object → repr → representation; exactness is more important than human friendliness; suited for
debugging; includes identifying information; generally best for logging; more info than str; repr is for
developers; str is for clients; import pdb; as a rule you should always write a repr() for your classes;
the default repr is not very helpful; str() → human-friendly representation of an object; not programmer-
oriented; str() is the constructor of the str type; when are the representations used → print() uses str by default,
str() falls back to repr() but repr() doesn't call str(); repr() is used when showing elements of a
collection; Interaction with format() → special __format__ method; reprlib → supports alternative
implementations of repr(); ascii(), ord() and chr() → ascii() replaces non-ASCII characters with escape
sequences; ord() → converts a single character to its integer Unicode codepoint; chr() → takes an integer
codepoint and returns the corresponding Unicode character
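
A minimal __repr__/__str__ sketch; the Point2D class is a stand-in example:

class Point2D:
    def __init__(self, x, y):
        self.x, self.y = x, y

    def __repr__(self):                        # unambiguous, for developers
        return f"Point2D(x={self.x}, y={self.y})"

    def __str__(self):                         # human friendly, for clients
        return f"({self.x}, {self.y})"

p = Point2D(3, 4)
print(str(p))       # (3, 4)
print(repr(p))      # Point2D(x=3, y=4)
print([p])          # collections show their elements using repr()
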
Chapter Summary:

Chapter 6 Numeric and Scalar types

reviewing the int and float types; sys.float_info; the decimal module and the Decimal type; rational numbers
with the Fraction type; the complex type and the cmath module → complex numbers; built-in numeric
functions abs() and round(); abs() gives the distance from zero; round() – performs decimal rounding for all
scalar number types; number base conversions → bin(), oct(), hex(); the datetime module and the date type
→ date, time, datetime, timedelta, timezone; the time type, the datetime type; timedelta → durations
with the timedelta type; arithmetic with datetime; timezones;
Chapter Summary:

Chapter 7 Iterables and Iteration

Multi input comprehensions → comprehensions → short-hand syntax for creating collections and
iterable objects → lists, dicts, set, generators → comprehensions can use multiple input sequences and
multiple if clauses; container populated “atomically” → allows python to optimize creation; more
readable; nested comprehensions → comprehensions can be nested inside other comprehensions;
the map() function → apply a function to every element in a sequence, producing a new sequence
map(ord, string) ; multiple input sequences → map() can accept any number of input sequences → the
number of input sequences must match the number of function arguments; map() vs comprehensions;
The filter() function → apply a function to each element in a sequence, constructing a new sequence
from the elements for which the function returns True; functools.reduce() →
repeatedly apply a function to the elements of a sequence, reducing them to a single value → the
functional-programming fold (C++ STL std::accumulate()); the optional initial value is conceptually just added to the start
of the input sequence; combining map() and reduce() → map-reduce algorithm; Iteration protocols;
iter() - create an iterator; next() - get next element in a sequence; StopIteration – signal the end of the
sequence; iterable – an object which implements __iter__() method; iterator – an object which
implements the iterable protocol and which implements the __next__ method; putting the protocols
together; alternative iterable protocol → works with any object that supports consecutive integer
indexing via __getitem__(); extended iter() format → iter(callable, sentinel); extended iter() is often
used for creating infinite sequences from existing functions.
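
A compact sketch of map(), filter(), functools.reduce() and the extended iter() form (the values are arbitrary):

from functools import reduce
import operator
import datetime

list(map(ord, "abc"))                          # [97, 98, 99]
list(filter(lambda x: x > 0, [1, -5, 0, 7]))   # [1, 7]
reduce(operator.add, [1, 2, 3, 4, 5], 0)       # 15 – fold with an initial value

# extended iter(): call a zero-argument callable until it returns the sentinel
timestamps = iter(datetime.datetime.now, None) # effectively an infinite sequence
next(timestamps)                               # a fresh datetime on each call
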
Chapter Summary:
Chapter 8 Inheritance and Subtype Polymorphism

Inheritance overview → single inheritance → class SubClass(BaseClass); subclasses will want to


initialize base classes; base class initializer will only be called automatically if subclass initializer is
undefined; other languages automatically call base class initializers; python treats __init__() like any
other method; base class __init__() is not called if overridden; use super() to call base class __init__()
A realistic example SortedList super() used to access base-class implementation; The BIF isinstance()
function → multiple inheritance in python is not much more complex than single inheritance;
isinstance() → determines if an object is of a specified type; BIF issubclass() → determines if one type
is a subclass of another; Multiple Inheritance – defining a class with more than one base class; class
SubClass(Base1, Base2, ...) – subclasses inherit methods from all bases; without conflict, names resolve
in the obvious way; Method Resolution Order (MRO) determines name lookup in all cases; details of
multiple inheritance; if a class has multiple base classes and defines no initializer, then only the initializer
of the first base class is automatically called; __bases__ – a tuple of base classes; mro → ordering that
determines method name lookup; methods may be defined in multiple places; mro is an ordering of the
inheritance graph, actually very simple; obj.method() mro(); mro is calculated with C3 algorithm;
subclasses come before base classes; base class order from class definition is preserved; the first two qualities
are preserved no matter where you start in the inheritance graph – not all inheritance declarations are
allowed; the BIF super() → given a MRO and a class C, super gives you an object which resolves
methods using only the part of the MRO which comes after C; super returns a proxy object which
routes method calls; bound proxy – bound to a specific class or instance; unbound proxy – not bound to
a class or instance; There are two types of bound proxies: instance-bound and class-bound; calling
super() without arguments; SortedIntList explained; the object class – object → the core of the Python
object model; object is the ultimate base class of every class; object is automatically added as a base
class; nominally typed language; in python inheritance is a way to share implementation;

Chapter Summary:

Chapter 9 Implementing Collections

collection protocol overview → collections, tuple, str, range, list, dict, set; → to implement a protocol
objects must support certain operations; most collections implement container sized and iterable; all
except dict and set are sequences; collection construction → TDD, test code design(refactor) – the
construction connection; the container protocol __contains__(item); the sized protocol - __len__(); the
iterable protocol __iter__(); the sequence protocol slicing; comprehensible test results with __repr__()
implementing equality and inequality; equality vs identity test; the sequence protocol reversing; the
sequence protocol index(); the sequence protocol count(); improving performance rom O(n) to O(log n)
Refactoring to avoid DRY – Dont Repeat Yourself; Checking Protocol Implementation; The sequence
protocol concatenation and repetition; the set protocol;
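
A minimal sketch of implementing the container, sized, iterable and sequence protocols on a custom collection; the SortedSet class here is an invented illustration:

class SortedSet:
    def __init__(self, items=None):
        self._items = sorted(set(items)) if items is not None else []

    def __contains__(self, item):       # container protocol: in / not in
        return item in self._items

    def __len__(self):                  # sized protocol: len()
        return len(self._items)

    def __iter__(self):                 # iterable protocol: iter()
        return iter(self._items)

    def __getitem__(self, index):       # sequence protocol: s[i]
        return self._items[index]

    def __repr__(self):                 # comprehensible test results
        return f"SortedSet({self._items!r})"

s = SortedSet([3, 1, 3, 2])
len(s), 2 in s, list(s)                 # (3, True, [1, 2, 3])
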
Chapter Summary:

Chapter 10 Exceptions and Errors

Always specify an exception type → avoid bad practices in python, exception handling; The Standard
Exception hierarchy → exceptions are arranged in an inheritance hierarchy; exception payloads;
defining new exceptions; chaining exceptions; traceback objects; assertions internal invariants;
assertions class invariants; assertions performance;
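
A short sketch of a custom exception with a payload and explicit chaining; the exception class and messages are invented:

class TriangleError(Exception):
    def __init__(self, text, sides):
        super().__init__(text)
        self._sides = tuple(sides)       # exception payload

    @property
    def sides(self):
        return self._sides

def triangle_area(a, b, c):
    sides = sorted((a, b, c))
    if sides[2] > sides[0] + sides[1]:
        raise TriangleError("Illegal triangle", sides)
    try:
        p = (a + b + c) / 2
        return (p * (p - a) * (p - b) * (p - c)) ** 0.5
    except ArithmeticError as e:
        raise TriangleError("Area calculation failed", sides) from e   # explicit chaining
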
Chapter Summary:

Chapter 11 Defining Context Managers

context manager → an object designed to be used in a with-statement; - a context-manager ensures


that resources are properly and automatically managed; the context managing protocol; a first context
manager example; __enter__() - called before entering with-statement body; return value bound to as
variable; – can return a value of any type; __exit__() – called when the with-statement exits;
__exit__ and exception propagation; the with-statement expansion PEP 343; contextlib → standard
library module for working with context-managers; multiple context managers; don't pass a list
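
A minimal sketch of writing a context manager with the contextlib decorator; the manager here just logs entry and exit and is invented:

import contextlib

@contextlib.contextmanager
def logging_block(name):
    print("entering", name)          # runs as __enter__
    try:
        yield name                   # value bound to the as-variable
    finally:
        print("exiting", name)       # runs as __exit__, even if an exception propagates

with logging_block("demo") as label:
    print("inside", label)
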
Chapter Summary:
Chapter 12 Introspection
Chapter Summary:

The Zen of Python, by Tim Peters

Beautiful is better than ugly.


Explicit is better than implicit.
Simple is better than complex.
Complex is better than complicated.
Flat is better than nested.
Sparse is better than dense.
Readability counts.
Special cases aren't special enough to break the rules.
Although practicality beats purity.
Errors should never pass silently.
Unless explicitly silenced.
In the face of ambiguity, refuse the temptation to guess.
There should be one-- and preferably only one --obvious way to do it.
Although that way may not be obvious at first unless you're Dutch.
Now is better than never.
Although never is often better than *right* now.
If the implementation is hard to explain, it's a bad idea.
If the implementation is easy to explain, it may be a good idea.
Namespaces are one honking great idea -- let's do more of those!
