numpy.ptp() plays an important role in statistics by finding the range of a given set of numbers: range = maximum value - minimum value.
Syntax: numpy.ptp(arr, axis=None, out=None)

arr: input array.

axis: axis along which we want the range value. Otherwise, it will consider arr to be flattened (working on all axes). axis = 0 means along the column and axis = 1 means working along the row.

out: [ndarray, optional] A different array in which we want to place the result. The array must have the same dimensions as the expected output.

Return: range of the array (a scalar value if axis is None), or an array with the range of values along the specified axis.
Code #1:

```
arr: [1, 2, 7, 20, nan]
Range of arr: nan

arr: [1, 2, 7, 10, 16]
Range of arr: 15
```
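The snippet itself is missing from this excerpt; a minimal sketch that reproduces the output above:

```python
import numpy as np

# Reconstructed sketch of the missing snippet, matching the output above.
arr = np.array([1, 2, 7, 20, np.nan])
print("Range of arr:", np.ptp(arr))   # nan -- a single nan makes the range nan

arr = np.array([1, 2, 7, 10, 16])
print("Range of arr:", np.ptp(arr))   # 15 (16 - 1)
```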
Code # 2:
```
arr: [[14, 17, 12, 33, 44], [15, 6, 27, 8, 19], [23, 2, 54, 1, 4]]
Range of arr, axis = None: 53
Range of arr, axis = 0: [ 9 15 42 32 40]
Range of arr, axis = 1: [32 21 53]
```
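Again the snippet is missing; a sketch reproducing the axis behaviour shown above:

```python
import numpy as np

# Reconstructed sketch for the output shown above.
arr = np.array([[14, 17, 12, 33, 44],
                [15,  6, 27,  8, 19],
                [23,  2, 54,  1,  4]])
print(np.ptp(arr))          # 53: flattened, 54 - 1
print(np.ptp(arr, axis=0))  # [ 9 15 42 32 40]: range of each column
print(np.ptp(arr, axis=1))  # [32 21 53]: range of each row
```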
Code #3:
```
Initial arr1: [0 1 2 3 4]
Changed arr1 (holding the result): [ 9 15 42 32 40]
```
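A sketch of the out parameter in action, matching the output above (the result of ptp along axis 0 is written into a pre-existing array):

```python
import numpy as np

# Reconstructed sketch: ptp writes its per-column result into arr1
# via the out parameter.
arr = np.array([[14, 17, 12, 33, 44],
                [15,  6, 27,  8, 19],
                [23,  2, 54,  1,  4]])
arr1 = np.arange(5)
print("Initial arr1:", arr1)   # [0 1 2 3 4]
np.ptp(arr, axis=0, out=arr1)
print("Changed arr1:", arr1)   # [ 9 15 42 32 40]
```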
I noticed that the Python 2.7 documentation includes yet another command-line parsing module. In addition to optparse we now have argparse.

Why has yet another command-line parsing module been created? Why should I use it instead of optparse? Are there new features that I should know about?
If the array contains both positive and negative data, I'd go with:
```python
import numpy as np

a = np.random.rand(3, 2)

# Normalised [0, 1]
b = (a - np.min(a)) / np.ptp(a)

# Normalised [0, 255] as integer: don't forget the parenthesis before astype(int)
c = (255 * (a - np.min(a)) / np.ptp(a)).astype(int)

# Normalised [-1, 1]
d = 2. * (a - np.min(a)) / np.ptp(a) - 1
```
If the array contains nan, one solution could be to just remove them as:

```python
def nan_ptp(a):
    return np.ptp(a[np.isfinite(a)])

b = (a - np.nanmin(a)) / nan_ptp(a)
```
However, depending on the context you might want to treat nan differently, e.g. interpolate the value, replace it with e.g. 0, or raise an error.
Finally, worth mentioning even if it's not OP's question, standardization:
e = (a - np.mean(a)) / np.std(a)
```python
import keyring

# the service is just a namespace for your app
service_id = "IM_YOUR_APP!"

keyring.set_password(service_id, "dustin", "my secret password")
password = keyring.get_password(service_id, "dustin")  # retrieve password
```
Usage if you want to store the username on the keyring:
```python
import keyring

MAGIC_USERNAME_KEY = "im_the_magic_username_key"

# the service is just a namespace for your app
service_id = "IM_YOUR_APP!"

username = "dustin"

# save password
keyring.set_password(service_id, username, "password")

# optionally, abuse `set_password` to save username onto keyring
# we're just using some known magic string in the username field
keyring.set_password(service_id, MAGIC_USERNAME_KEY, username)
```
Later, to get your info from the keyring:
```python
# again, abusing `get_password` to get the username.
# after all, the keyring is just a key-value store
username = keyring.get_password(service_id, MAGIC_USERNAME_KEY)
password = keyring.get_password(service_id, username)
```
Items are encrypted with the user's operating system credentials, thus other applications running in your user account would be able to access the password.
To obscure that vulnerability a bit you could encrypt/obfuscate the password in some manner before storing it on the keyring. Of course, anyone who was targeting your script would just be able to look at the source and figure out how to unencrypt/unobfuscate the password, but you'd at least prevent some application vacuuming up all passwords in the vault and getting yours as well.
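A sketch of that idea, with base64 standing in for whatever encoding you choose (this is obfuscation, not encryption):

```python
import base64

# Illustrative only: base64 merely obscures the stored value; anyone
# with access to your source can reverse it.
def obfuscate(secret):
    return base64.b64encode(secret.encode("utf-8")).decode("ascii")

def deobfuscate(stored):
    return base64.b64decode(stored.encode("ascii")).decode("utf-8")

stored = obfuscate("my secret password")   # what you'd hand to set_password
print(deobfuscate(stored))                 # my secret password
```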
To read user input you can try the cmd module for easily creating a mini-command line interpreter (with help texts and autocompletion) and raw_input (input for Python 3+) for reading a line of text from the user.
```python
text = raw_input("prompt")  # Python 2
text = input("prompt")      # Python 3
```
Command line inputs are in
sys.argv. Try this in your script:
```python
import sys
print(sys.argv)
```
There are two modules for parsing command line options: optparse (deprecated since Python 2.7, use argparse instead) and getopt. If you just want to input files to your script, behold the power of fileinput.
The Python library reference is your friend.
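A minimal fileinput sketch (the demo file path here is made up; called with no arguments, fileinput.input() instead iterates over the files named in sys.argv[1:], or stdin if none are given):

```python
import fileinput
import os
import tempfile

# Hypothetical demo file standing in for one passed on the command line.
path = os.path.join(tempfile.mkdtemp(), "demo.txt")
with open(path, "w") as f:
    f.write("first line\nsecond line\n")

# With no arguments, fileinput.input() would read the files named in
# sys.argv[1:] instead of this explicit list.
lines = [line.rstrip() for line in fileinput.input([path])]
print(lines)  # ['first line', 'second line']
```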
An example (listing the methods of the OptionParser class):
```python
>>> from optparse import OptionParser
>>> import inspect
# python2
>>> inspect.getmembers(OptionParser, predicate=inspect.ismethod)
[('__init__', <unbound method OptionParser.__init__>),
 ...
 ('add_option', <unbound method OptionParser.add_option>),
 ('add_option_group', <unbound method OptionParser.add_option_group>),
 ('add_options', <unbound method OptionParser.add_options>),
 ('check_values', <unbound method OptionParser.check_values>),
 ('destroy', <unbound method OptionParser.destroy>),
 ('disable_interspersed_args', <unbound method OptionParser.disable_interspersed_args>),
 ('enable_interspersed_args', <unbound method OptionParser.enable_interspersed_args>),
 ('error', <unbound method OptionParser.error>),
 ('exit', <unbound method OptionParser.exit>),
 ('expand_prog_name', <unbound method OptionParser.expand_prog_name>),
 ...]
# python3
>>> inspect.getmembers(OptionParser, predicate=inspect.isfunction)
...
```
getmembers returns a list of 2-tuples. The first item is the name of the member, the second item is the value.
You can also pass an instance to getmembers:
```python
>>> parser = OptionParser()
>>> inspect.getmembers(parser, predicate=inspect.ismethod)
...
```
Basically, you have 5 steps:
```python
def find_paws(data, smooth_radius=5, threshold=0.0001):
    data = sp.ndimage.uniform_filter(data, smooth_radius)
    thresh = data > threshold
    filled = sp.ndimage.morphology.binary_fill_holes(thresh)
    coded_paws, num_paws = sp.ndimage.label(filled)
    data_slices = sp.ndimage.find_objects(coded_paws)
    return data_slices
```
1. Blur the input data a bit to make sure the paws have a continuous footprint. (It would be more efficient to just use a larger kernel (the structure kwarg to the various scipy.ndimage.morphology functions) but this isn't quite working properly for some reason...)

2. Threshold the array so that you have a boolean array of places where the pressure is over some threshold value (i.e. thresh = data > value).

3. Fill any internal holes, so that you have cleaner regions (filled = sp.ndimage.morphology.binary_fill_holes(thresh)).

4. Find the separate contiguous regions (coded_paws, num_paws = sp.ndimage.label(filled)). This returns an array with the regions coded by number: each region is a contiguous area of a unique integer (1 up to the number of paws) with zeros everywhere else.

5. Isolate the contiguous regions using data_slices = sp.ndimage.find_objects(coded_paws). This returns a list of tuples of slice objects, so you could get the region of the data for each paw with [data[x] for x in data_slices]. Instead, we'll draw a rectangle based on these slices, which takes slightly more work.
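To make the labelling step concrete, here is a toy sketch of steps 4 and 5 (the array below is invented, standing in for the thresholded-and-filled pressure data):

```python
import numpy as np
from scipy import ndimage

# Two separate "paws" in a toy boolean array.
filled = np.array([[1, 1, 0, 0, 0],
                   [1, 1, 0, 0, 0],
                   [0, 0, 0, 1, 1],
                   [0, 0, 0, 1, 1]], dtype=bool)

coded_paws, num_paws = ndimage.label(filled)
print(num_paws)        # 2 contiguous regions
data_slices = ndimage.find_objects(coded_paws)
print(data_slices[0])  # (slice(0, 2, None), slice(0, 2, None))
```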
The two animations below show your "Overlapping Paws" and "Grouped Paws" example data. This method seems to be working perfectly. (And for whatever it's worth, this runs much more smoothly than the GIF images below on my machine, so the paw detection algorithm is fairly fast...)
Here"s a full example (now with much more detailed explanations). The vast majority of this is reading the input and making an animation. The actual paw detection is only 5 lines of code.
```python
import numpy as np
import scipy as sp
import scipy.ndimage
import matplotlib.pyplot as plt
from matplotlib.patches import Rectangle

def animate(input_filename):
    """Detects paws and animates the position and raw data of each frame
    in the input file"""
    # With matplotlib, it's much, much faster to just update the properties
    # of a display object than it is to create a new one, so we'll just update
    # the data and position of the same objects throughout this animation...
    infile = paw_file(input_filename)

    # Since we're making an animation with matplotlib, we need
    # ion() instead of show()...
    plt.ion()
    fig = plt.figure()
    ax = fig.add_subplot(111)
    fig.suptitle(input_filename)

    # Make an image based on the first frame that we'll update later
    # (The first frame is never actually displayed)
    im = ax.imshow(infile.next())

    # Make 4 rectangles that we can later move to the position of each paw
    rects = [Rectangle((0, 0), 1, 1, fc="none", ec="red") for i in range(4)]
    [ax.add_patch(rect) for rect in rects]

    title = ax.set_title("Time 0.0 ms")

    # Process and display each frame
    for time, frame in infile:
        paw_slices = find_paws(frame)

        # Hide any rectangles that might be visible
        [rect.set_visible(False) for rect in rects]

        # Set the position and size of a rectangle for each paw and display it
        for slice, rect in zip(paw_slices, rects):
            dy, dx = slice
            rect.set_xy((dx.start, dy.start))
            rect.set_width(dx.stop - dx.start + 1)
            rect.set_height(dy.stop - dy.start + 1)
            rect.set_visible(True)

        # Update the image data and title of the plot
        title.set_text("Time %0.2f ms" % time)
        im.set_data(frame)
        im.set_clim([frame.min(), frame.max()])
        fig.canvas.draw()

def find_paws(data, smooth_radius=5, threshold=0.0001):
    """Detects and isolates contiguous regions in the input array"""
    # Blur the input data a bit so the paws have a continuous footprint
    data = sp.ndimage.uniform_filter(data, smooth_radius)
    # Threshold the blurred data (this needs to be a bit > 0 due to the blur)
    thresh = data > threshold
    # Fill any interior holes in the paws to get cleaner regions...
    filled = sp.ndimage.morphology.binary_fill_holes(thresh)
    # Label each contiguous paw
    coded_paws, num_paws = sp.ndimage.label(filled)
    # Isolate the extent of each paw
    data_slices = sp.ndimage.find_objects(coded_paws)
    return data_slices

def paw_file(filename):
    """Returns an iterator that yields the time and data in each frame

    The infile is an ascii file of timesteps formatted similar to this:

    Frame 0 (0.00 ms)
    0.0 0.0 0.0
    0.0 0.0 0.0

    Frame 1 (0.53 ms)
    0.0 0.0 0.0
    0.0 0.0 0.0
    ...
    """
    with open(filename) as infile:
        while True:
            try:
                time, data = read_frame(infile)
                yield time, data
            except StopIteration:
                break

def read_frame(infile):
    """Reads a frame from the infile."""
    frame_header = infile.next().strip().split()
    time = float(frame_header[-2][1:])
    data = []
    while True:
        line = infile.next().strip().split()
        if line == []:
            break
        data.append(line)
    return time, np.array(data, dtype=np.float)

if __name__ == "__main__":
    animate("Overlapping paws.bin")
    animate("Grouped up paws.bin")
    animate("Normal measurement.bin")
```
Update: As far as identifying which paw is in contact with the sensor at what times, the simplest solution is to just do the same analysis, but use all of the data at once. (i.e. stack the input into a 3D array, and work with it, instead of the individual time frames.) Because SciPy's ndimage functions are meant to work with n-dimensional arrays, we don't have to modify the original paw-finding function at all.
```python
# This uses functions (and imports) in the previous code example!!
def paw_regions(infile):
    # Read in and stack all data together into a 3D array
    data, time = [], []
    for t, frame in paw_file(infile):
        time.append(t)
        data.append(frame)
    data = np.dstack(data)
    time = np.asarray(time)

    # Find and label the paw impacts
    data_slices = find_paws(data, smooth_radius=4)

    # Sort by time of initial paw impact... This way we can determine which
    # paws are which relative to the first paw with a simple modulo 4.
    # (Assuming a 4-legged dog, where all 4 paws contacted the sensor)
    data_slices.sort(key=lambda dat_slice: dat_slice[2].start)

    # Plot up a simple analysis
    fig = plt.figure()
    ax1 = fig.add_subplot(2, 1, 1)
    annotate_paw_prints(time, data, data_slices, ax=ax1)
    ax2 = fig.add_subplot(2, 1, 2)
    plot_paw_impacts(time, data_slices, ax=ax2)
    fig.suptitle(infile)

def plot_paw_impacts(time, data_slices, ax=None):
    if ax is None:
        ax = plt.gca()

    # Group impacts by paw...
    for i, dat_slice in enumerate(data_slices):
        dx, dy, dt = dat_slice
        paw = i % 4 + 1

        # Draw a bar over the time interval where each paw is in contact
        ax.barh(bottom=paw, width=time[dt].ptp(), height=0.2,
                left=time[dt].min(), align="center", color="red")
    ax.set_yticks(range(1, 5))
    ax.set_yticklabels(["Paw 1", "Paw 2", "Paw 3", "Paw 4"])
    ax.set_xlabel("Time (ms) Since Beginning of Experiment")
    ax.yaxis.grid(True)
    ax.set_title("Periods of Paw Contact")

def annotate_paw_prints(time, data, data_slices, ax=None):
    if ax is None:
        ax = plt.gca()

    # Display all paw impacts (sum over time)
    ax.imshow(data.sum(axis=2).T)

    # Annotate each impact with which paw it is
    # (Relative to the first paw to hit the sensor)
    x, y = [], []
    for i, region in enumerate(data_slices):
        dx, dy, dt = region
        # Get x,y center of slice...
        x0 = 0.5 * (dx.start + dx.stop)
        y0 = 0.5 * (dy.start + dy.stop)
        x.append(x0)
        y.append(y0)

        # Annotate the paw impacts
        ax.annotate("Paw %i" % (i % 4 + 1), (x0, y0),
                    color="red", ha="center", va="bottom")

    # Plot line connecting paw impacts
    ax.plot(x, y, "-wo")
    ax.axis("image")
    ax.set_title("Order of Steps")
```
As of Python 2.7, optparse is deprecated, and will hopefully go away in the future.
More information is also in PEP 389, which is the vehicle by which
argparse made it into the standard library.
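For comparison with the optparse example further down in this thread, the argparse equivalent looks like this (the option name is just illustrative):

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("-q", "--query", default="spam", help="query string")

# Passing an explicit argument list here instead of reading sys.argv[1:].
args = parser.parse_args(["--query", "myquery"])
print(args.query)  # myquery
```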
This answer suggests optparse, which is appropriate for older Python versions. For Python 2.7 and above, argparse replaces optparse. See this answer for more information.
As other people pointed out, you are better off going with optparse over getopt. getopt is pretty much a one-to-one mapping of the standard getopt(3) C library functions, and not very easy to use.
optparse, while being a bit more verbose, is much better structured and simpler to extend later on.
Here"s a typical line to add an option to your parser:
parser.add_option("-q", "--query", action="store", dest="query", help="query string", default="spam")
It pretty much speaks for itself; at processing time, it will accept -q or --query as options, store the argument in an attribute called query and use a default value if you don't specify it. It is also self-documenting in that you declare the help argument (which will be used when run with -h/--help) right there with the option.
Usually you parse your arguments with:
options, args = parser.parse_args()
This will, by default, parse the standard arguments passed to the script (sys.argv[1:])
options.query will then be set to the value you passed to the script.
You create a parser simply by doing
parser = optparse.OptionParser()
These are all the basics you need. Here"s a complete Python script that shows this:
```python
import optparse

parser = optparse.OptionParser()
parser.add_option("-q", "--query",
                  action="store", dest="query",
                  help="query string", default="spam")
options, args = parser.parse_args()
print "Query string:", options.query
```
5 lines of python that show you the basics.
Save it in sample.py, and run it once with

python sample.py

and once with

python sample.py --query myquery
Beyond that, you will find that optparse is very easy to extend. In one of my projects, I created a Command class which allows you to nest subcommands in a command tree easily. It uses optparse heavily to chain commands together. It's not something I can easily explain in a few lines, but feel free to browse around in my repository for the main class, as well as a class that uses it and the option parser.