Tuesday, October 4, 2016

Is a Central Bank or Bitcoin Right for the US?

Scrapbook #5: Does the US need a Central Bank?
https://news.bitcoin.com/jerry-brito-task-force-blockchain/

Summary of article:

Fifty (50) attorneys have joined forces to form another blockchain industry advocacy group. The Digital Currency and Ledger Defense Coalition (DCLDC) will focus its efforts on legal issues that surround the digital currency and blockchain environment.

Reason article was selected:

The Federal Reserve, our central bank, seems to be a key part of our financial system, but is it really necessary?  Should Bitcoin ever replace a national currency such as the US Dollar?

Personal/social values at stake:

I began my research into this matter by reviewing the history of central banking in the US, and, in very brief summary, here it is: for much of our history, actually, there was none.  Treasury Secretary Alexander Hamilton's creation of the First Bank of the United States was contentious, opposed by Thomas Jefferson and James Madison, who foresaw it becoming an engine of speculation, financial manipulation and corruption.  Its charter was allowed to lapse in 1811, and although a Second Bank was chartered under Madison in 1816, Andrew Jackson, who shared those earlier suspicions, later prevented the renewal of its charter as well.

These early and short-lived First and Second Banks of the US had produced only a fraction of the country's paper money (backed by gold and silver), alongside notes issued by the states.  Then, after the non-renewal mentioned, there was an era between 1837 and 1862 in which there was no central bank, just state banks.  These state banks lasted only about five years on average, failing amid liquidity crises, panics and bank runs that left them unable to redeem customers' notes.  Notes varied in value from bank to bank and were usually uninsured, so there was a great deal of uncertainty and loss.

After the Civil War, national banks were chartered again with higher reserve standards (greenbacks circulated, too) and a uniform national currency, but there were still panics, one of the worst occurring in 1907.  As a result, in 1913, in an effort to stabilize the economy and prevent panics with a lender of last resort, President Wilson signed the Federal Reserve Act into law.

During WWI, the Federal Reserve helped the Treasury raise money by selling war bonds.  And, since 1971, the US has been fully off the gold standard, which is pretty phenomenal.  Also, during this later period, the Fed was further charged by Congress to effectively promote the goals of maximum employment, stable prices and moderate long-term interest rates.

Today's twist in centralized vs. decentralized banking comes in the form of blockchain ledger technology, the technical foundation of the Bitcoin cryptocurrency. Bitcoin, along with numerous smaller cryptocurrencies, has gained favor in recent years thanks to an unlikely group of allied interests: those interested in cutting-edge technology; others interested in moving money internationally; and still others wary of central banks and fiat currencies -- all with a vested interest in seeing it succeed.

Recently, a group of legal experts has teamed up to create the Digital Currency and Ledger Defense Coalition (DCLDC) to help defend the currency against legal challenges. The group likens itself to the ACLU and EFF.

Bitcoin is in some ways the antithesis of a central bank; its legitimacy is not backed by any central authority.  Instead, its ledger and the shared interests of the “miners” who create the money are what keep it in check. This has profound implications, as there is no government group or bank backing the currency.

One could take a not-too-far-fetched leap, then, and wonder what the world would be like without any central authority for money; for what is money, simply, other than an exchange mechanism for goods and services? Considering the US existed for many years without one, the idea is not entirely without precedent.  Ultimately, however, I conclude that such a currency would be prohibitively volatile and risky without the full faith and credit of the US behind it, just as money was in the years before the Fed existed.

Credibility of the source:

Fast Company describes itself as a leading progressive, business media company, which was launched in November 1995 by Alan Webber and Bill Taylor, two former Harvard Business Review editors.



Tuesday, September 27, 2016

Throw-away Society?

Scrapbook #4: Throw-away society?

PDF of the same: 

Summary of article:

Sweden is again leading the way with progressive policy by offering its citizens tax breaks on the repair of used or broken items.  The incentive reduces waste by keeping repair costs from approaching the price of outright replacement, which too often makes fixing things prohibitively expensive.

Reason article was selected:

Do we as a society behave unethically by replacing and throwing away our items, and, specifically, our technology, too often and soon?  Should our government enact progressive taxation so as to, as Sweden’s finance and consumption minister Per Bolund describes it in the article, 'nudge' consumers to make right decisions for the sake of the health of our planet?

Personal/social values at stake:

While it's unethical to be wasteful, and I am a big proponent of repairing many items, certainly those that were of high quality to begin with, technology by its nature is not merely newfangled but constantly innovative and progressive.  Therefore, technology's rapid development makes it an exception to this rule.  Users greatly benefit from the frequent replacement of their devices, which become outdated within just a couple of years.  Moore's law refers to an observation made by Intel co-founder Gordon Moore in 1965 that the number of transistors per square inch on integrated circuits had doubled every year since their invention and would continue to do so (a pace he later revised to roughly every two years).

Yet, then again, while Moore's law predicts that this trend will continue into the foreseeable future, some say it will end around 2020.  And, even now, in contrast to just a few years ago, I can note that many people I know are fatigued by the thought of upgrading their iPhones, as the latest generation (the seventh) fails to pique interest.  More and more are opting to replace a broken screen, for example, rather than the entire device.  Similarly, for another example, the MacBooks have received only limited updates over the course of several years and still feel avant-garde.  Therefore, as the pace of technological advancement slows, it becomes more sensible to repair even these devices, and a tax break for doing so would be welcome.

Credibility of the source:

Fast Company describes itself as a leading progressive, business media company, which was launched in November 1995 by Alan Webber and Bill Taylor, two former Harvard Business Review editors.


Tuesday, September 20, 2016

Ethics and Security of Self-driving Technology

Scrapbook #3: Team of hackers take remote control of Tesla Model S from 12 miles away

PDF of the same: 


Summary of article:

A team of Chinese security researchers from Tencent's Keen Security Lab were able to remotely access controls inside a Tesla Model S. They deployed a malicious Wi-Fi hotspot and gained entry through the web browser within the car.

Reason article was selected:

I chose this article because it raises the ethical question of producing self-driving software that may endanger the lives of not just the operators of the vehicle, but also the populace at large.

Personal/social values at stake:

Innovative technology can be thrilling, but when it's dangerous we as a society must weigh the risks posed by bad actors who may use it for harm.

Self-driving technology does have advantages, as it can be safer than a human driver, but it is not without risks of its own.

Credibility of the source:

The Guardian is a reputable news source.

Tuesday, September 13, 2016

On the Facebook Content Control System's Imposition of American Values

Scrapbook #2:  On the Facebook (FB) Content Control System's Imposition of American Values, regarding the Telegraph UK article, "Facebook is Imposing Prissy American Censorship on the Whole Rest of the World," by Jane Fae

Link to the article, which appeared in the Telegraph UK: http://www.telegraph.co.uk/technology/2016/09/12/facebook-is-imposing-prissy-american-censorship-on-the-whole-res/

PDF of the same: https://drive.google.com/a/csumb.edu/file/d/0B3hQr_XgIHuEMjBEbEg0bDF2eE9uUS0tZ1lPX0E5SlRmN2c0/view?usp=sharing

Summary of the article:

Telegraph writer Jane Fae recounts the censorship that resulted, over the last week, in the summary removal of multiple FB posts in Norway of Nick Ut's iconic Pulitzer Prize-winning, Vietnam War era photograph, "The Terror of War," which shows children, including a naked nine-year-old Kim Phuc, fleeing a napalm attack.

The historical image continues to be deeply disturbing to this day and is poignantly recognized for its importance in revealing the harrowing reality of the war.  

Norwegian author Tom Egeland's FB account was suspended after he posted the photo as part of a status concerning photos that "changed the history of warfare." Facebook has implemented algorithms for reviewing user-reported images in order to address accusations of bias, and child nudity is prohibited in any case.  Protests over the suspension made on Egeland's behalf followed from prestigious quarters: first the editor-in-chief of Norway's largest newspaper, Aftenposten, and then the Prime Minister of Norway, Erna Solberg.  Their posts were summarily censored as well, causing outrage and resulting in accusations of abuse of power on FB's part.

Fae describes the content control as haphazard and summary in nature. She further decries the imposition of American values on the rest of the world and particularly her native Britain via the editorial control FB wields, which she describes as "middlebrow frat boy liberalism."  Similarly, she writes, "So yes, laugh, but understand that Facebook's immense cultural influence is pervasive and pernicious: an anaemic American liberalism dressed up as high-mindedness which few people in government, until recently, have been prepared to stand up to."

Reason article was selected:

The impact of a globalized social network on a country's society and culture raises ethical questions.

Personal/social values at stake:

While it's acceptable to me that there should be some automation in censorship, based on guidelines, there should have been some humanized, thoughtful intervention before the newspaper editor and prime minister had to get involved, and certainly once they did, rather than a continuation of the summary removal of their posts.

FB had good intentions when it attempted to prohibit publication of paedophilic images.  In this case, it should have thoughtfully considered that such does not apply, and that this image is one that must not be censored due to its historical importance.  However, I can see why there could be some doubt on the part of a censor who does not appreciate the context, and think it is understandably controversial.  The image is terrible.

I think it's unreasonable to request that FB go uncensored.  The company has no choice, then, but to make the best guidelines it can, or to grant each country its own panel that can best determine what is right within its own locale.  Since FB is an American company, however, it will have to protect its legal interests, and our nation's mores will undoubtedly shine through in rudimentary guidelines.  We are not as comfortable with nudity as the Europeans, who likewise would not accept such images of children, which is not what is at issue here.  For example, many Americans approve of, and others object to, the display of breastfeeding, while violence is depicted here with fewer qualms than in other parts of the world.  These are fair observations that the writer makes.

Currently, many Europeans charge that they are beset by an onslaught of forces threatening their historical and cultural makeup, and to these claims I am very sympathetic.  I don't want to dismiss the writer's concerns as merely a result of this general sentiment of late, and I am usually the first to criticize America when criticism is due, because I think we have a tendency to rest on our laurels and deny it when we are in the wrong; yet I can't agree with the reasonableness of these charges.  Our values are defensible and acceptable, and in any case they are what they are; the characterizations made are too harsh.  But I can respect the need for localized censorship on a country-by-country basis, as well as a more personal censorship process that is less authoritarian, offensive or Kafkaesque.

Yet, I think FB's handling of the situation acknowledged that these issues are difficult, and that they are doing their best to protect, in this case, a value that we are proud to stand up for anywhere in the world, which is the vileness and wrongness of paedophilic images.  Furthermore, they should differentiate between countries who share our values to some extent and those that we consider repressive and unethical.

Credibility of sources:

The article is an op-ed from a reputable news source.

Tuesday, September 6, 2016

Sunday’s Malicious DDoS Attacks against Linode

Scrapbook #1:  Sunday’s malicious DDoS attacks against Linode and the article, “The Twelve Days of Crisis – A Retrospective on Linode’s Holiday DDoS Attacks” by Alex Forster

Links:

Linode’s Live Status Updates from Sunday, 09/04/2016
https://status.linode.com/

Linode’s Retrospective by Alex Forster on another DDoS attack, published earlier this year, 01/29/2016
https://blog.linode.com/2016/01/29/christmas-ddos-retrospective/  

The above in PDF format:
https://drive.google.com/a/csumb.edu/file/d/0B3hQr_XgIHuEZmhaNXV2ZWVWMjlQLUpLSktYWkpMcXhHWnlN/view?usp=sharing

https://drive.google.com/a/csumb.edu/file/d/0B3hQr_XgIHuEcURJYzRINlI4RWN1NlFZakdLNlNmM0lPYzkw/view?usp=sharing

https://drive.google.com/a/csumb.edu/file/d/0B3hQr_XgIHuEZ1NVeW1aSmhTMlQ1R0RoVUNyTlBmYUtOMm9F/view?usp=sharing

Summary of the article and status updates:

On Sunday morning, September 4, 2016, Linode, a company that provides virtual private servers (KVM-based) and cloud hosting, began reporting, via their website's status updates page, that their Atlanta regional data center was being hit with distributed denial of service (DDoS) attacks.

This is not the first time the company has been a target of such a malicious attack meant to damage its business, as I researched to find the article linked above, published in January of this year, in which the company provides a retrospective on an extensive and lengthy attack that took place during the last Christmas/Winter/New Year holiday season. In publishing the article, the company has sought to provide to its clients and interested readers a transparent report of the attack as well as a retrospective account of what was learned.

The specific attacks (numbered in the hundreds) on the Atlanta data center were volumetric in nature, according to Forster:  “A volumetric attack is the most common type of DDoS attack in which a cannon of garbage traffic is directed toward an IP address, wiping the intended victim off the Internet. It’s the virtual equivalent to intentionally causing a traffic-jam using a fleet of rental cars, and the pervasiveness of these types of attacks has caused hundreds of billions of dollars in economic loss globally.” Forster writes, further, that it’s typical for Linode to get dozens of such attacks each day, for which their response tool is remote-triggered blackholing. “When an IP address is ‘blackholed,’ the Internet collectively agrees to drop all traffic destined to that IP address, preventing both good and bad traffic from reaching it,” the author writes.

Blackholing fails or is ineffective, he goes on to explain, when the targeted IP is a critical piece of their or their colocation providers’ network infrastructure (e.g., API endpoints or DNS servers) that affects many others’ connections. Additionally, the article explains, Linode’s customers have secondary IP addresses on their routers, which are susceptible to attack and, in this case, were subject to dozens of simultaneous attacks. Mitigation was manual, and so exceptionally challenging, slow and error-prone; also, only so much blackholing can be done at any one time, because it too is subject to error.

Finally, the colocation providers’ crossconnects also became the subject of attacks. He writes, “a crossconnect can generally be thought of as the physical link between any two routers on the Internet. Each side of this physical link needs an IP address so that the two routers can communicate with each other, and it was those IP addresses that were targeted.” The attacks were unpredictable and many in number, if not entirely novel in nature.

In the statement, Forster of Linode additionally shared that they felt apologetic and humbled by the experience, and that lessons were learned, specifically: 1) don’t depend on middlemen, i.e., the colocation partners, for IP transit; 2) absorb larger attacks, i.e., increase IP transit capacity; and 3) do a better job of letting customers know what’s happening, which they successfully did on Sunday.

Reason article was selected:

That malicious DDoSing or non-white-hat hacking is unethical is largely uncontested.  (Next week, I will choose to write on an issue that is more controversial, perhaps.)  Still, I selected the article because it’s a timely, fascinating topic regarding the ethics of the internet and computing, when hackers can cause so much harm from the dark depths of the digital realm.

Linode has handled the situation very well, providing informative status updates to its customers in real time, so that customers’ loyalty is likely unshaken.  Indeed, in the comments section of the article, many customers wrote that they were grateful for Linode’s articulated response and its employees’ hard work to correct the issues over their holiday season vacations.  They expressed they were understanding of the difficulty of fighting a fire and simultaneously reporting on it while doing so.

Personal/social values at stake:

The hackers' actions are not only unethical but illegal, harming an honest company’s services, for which customers have paid and on which their businesses rely.  As for who the perpetrators were or are -- one may simply blame anonymous, irrational actors, hackers bent on mindless disruption, or, alternatively, perhaps, unscrupulous competitors who wish to damage their opponent’s business.

Credibility of sources:

Both documents are primary sources, provided by Linode company employees, all representatives of the victim of the DDoS hacking itself.

Tuesday, August 30, 2016

session_destroy()

I've really enjoyed delving deeper into internet programming with this class and wish we had more time to cover the material since it was so extensive.  Coding is preferable to writing prose and contemplating ethics, for me, so I wish this class were sixteen weeks and continuing on, rather than coming to an end.  I especially enjoyed studying AJAX and PHP.  I'm most glad to have learned about PDO database connections, GET and POST methods for client/server requests/responses like in the processing of form data, as well as uploading files with PHP, validation, user authentication, sessions, cookies, password encryption -- these were all new to me in the context of coding, whether we were using SQL, JQuery, PHP or AJAX.  My studies must continue before I reach a level of proficiency with which I will be pleased, but it's been a good start, a good foundation, and I am confident my hard work will eventually result in success.  Looking forward to applying what I've learned on web projects.
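Putting a few of those pieces together, here is a minimal sketch of the PDO prepared-statement and password-hashing pattern; it uses an in-memory SQLite database so it is self-contained (the class used MySQL, where only the DSN would differ), and the table and column names are invented:

```php
<?php
// PDO connection; SQLite stands in for the class's MySQL DSN.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$pdo->exec('CREATE TABLE users (id INTEGER PRIMARY KEY, username TEXT, hash TEXT)');

// Store a password as a one-way hash, never as plain text.
$stmt = $pdo->prepare('INSERT INTO users (username, hash) VALUES (?, ?)');
$stmt->execute(['alice', password_hash('s3cret', PASSWORD_DEFAULT)]);

// Authenticate: fetch the stored hash and verify the candidate password.
$stmt = $pdo->prepare('SELECT hash FROM users WHERE username = ?');
$stmt->execute(['alice']);
$row = $stmt->fetch(PDO::FETCH_ASSOC);
$ok  = $row && password_verify('s3cret', $row['hash']);
echo $ok ? "authenticated\n" : "rejected\n";
```

The prepared statements are also what guard against SQL injection in the form-handling we covered, since user input never reaches the query text directly.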

Tuesday, August 16, 2016

JavaScript/JQuery

I don't have much to say this week, except that it's been good to review what I have studied in the past of JQuery and JavaScript.  I was surprised to learn that it's now recommended that small, internal scripts be placed between the head tags, rather than at the bottom of the body.  (If it is not small, one should link a separate .js file.)  Otherwise, I need to read a lot more documentation and step up my game, so as to move away from the elementary level, where I am currently.

Tuesday, August 9, 2016

Group Project: Streamflix

Our team project is a Netflix-like clone, with a webpage and database for an online movie-streaming service called Streamflix:

http://hosting.otterlabs.org/classes/voss8812/CST336/assignments/streamflix/streamflix.php

It's designed to be intuitive: users can register for an account and browse some of the movies that are available.

There is also a webpage for data manipulation language (DML) type commands, which you can visit via the link below, but please do not alter (too much), lest we lose the records we want added for the users of the page above:

http://hosting.otterlabs.org/classes/voss8812/CST336/assignments/streamflix/streamflix_categorize.php

This week, some of my teammates really stepped up and wrote some complex code.  We used JIRA for the first time, which proved an excellent resource for distributed task management -- I love it!

Tuesday, August 2, 2016

phpMyAdmin

phpMyAdmin is a simple GUI for importing data into and administering MySQL.  It auto-generates everything, which is efficient, yet precludes some fine-tuning.  At least, I had some issues with changing some of the data types from VARCHAR(10) to DATE or TIME, where relevant, after importing the .CSV.  The intricacies of the GUI may have eluded me here; on second thought, perhaps it simply required reloading the page.
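For what it's worth, the type change itself is a one-line statement in MySQL that can be run from the tool's SQL tab, e.g. ALTER TABLE events MODIFY event_date DATE; (table and column names invented here).  To keep a runnable sketch self-contained, the snippet below uses PDO against an in-memory SQLite database, which cannot MODIFY a column, so it shows the rebuild-and-copy pattern that stands in for it:

```php
<?php
// Changing a column's type after a CSV import. SQLite lacks
// ALTER ... MODIFY, so we rebuild the table with the desired type
// and copy the rows over (all names are made-up placeholders).
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$pdo->exec('CREATE TABLE events (id INTEGER, event_date VARCHAR(10))');
$pdo->exec("INSERT INTO events VALUES (1, '2016-08-02')");

// Rebuild with the corrected type, copy, drop, rename.
$pdo->exec('CREATE TABLE events_new (id INTEGER, event_date DATE)');
$pdo->exec('INSERT INTO events_new SELECT id, event_date FROM events');
$pdo->exec('DROP TABLE events');
$pdo->exec('ALTER TABLE events_new RENAME TO events');

$date = $pdo->query('SELECT event_date FROM events')->fetchColumn();
```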

This is all great to learn, and I am driven to explore all angles of this material.  I'm also continuing to enjoy the exposure to the back-end, which is pretty new to me.

My colleague Joe listened to my griping about the lack of a task manager for our group assignments and took the initiative to set up our group with an Atlassian JIRA account, which I think is pretty freaking awesome.  Thanks, Joe!  Trello and Asana might be free options, but we really wanted to choose the one we would use in the workplace and get more exposure to it.  Plus there's the GitHub integration, he says, which I'll look into this week as we complete the assignment.


Tuesday, July 26, 2016

PHP and Forms

Knowing how to use PHP to request or send data to a server, and specifically to process or collect form data, is very important, seemingly a top use case for a typical website.  It's been valuable to brush up on my use of form elements and attributes, as well as to learn more about PHP.  This week, we've primarily used arrays as well as $_GET and $_POST to collect form data.  $_GET reveals its data in the URL, so I tended to avoid it in favor of $_POST.
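A minimal sketch of the $_POST pattern described above; the field names are invented, and the submission is simulated so the script runs standalone (in a real page the browser would populate $_POST):

```php
<?php
// Simulate a form submission so the script runs on its own
// (hypothetical field names for illustration).
$_POST = ['name' => '  Ada <script>  ', 'email' => 'ada@example.com'];

// Trim and escape user input before echoing it back to the page.
$name  = htmlspecialchars(trim($_POST['name'] ?? ''));
$email = filter_var(trim($_POST['email'] ?? ''), FILTER_VALIDATE_EMAIL);

if ($name !== '' && $email !== false) {
    $greeting = "Hello, $name ($email)";
} else {
    $greeting = 'Invalid submission';
}
echo $greeting, "\n";
```

Escaping with htmlspecialchars() is what keeps a submitted `<script>` tag from executing in the response.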

I've been working at an internship over the last several weeks.  Last week was rather rough, and now I'm delayed in completing this week's assignment, which is disappointing.  By tomorrow, however, it should be ready for submission.

Tuesday, July 19, 2016

Intro to PHP

Learning PHP is delightfully less challenging than I thought it would be, given I've already studied other programming languages.  It's definitely the case that each new language is easier to pick up than the one prior....  Among its idiosyncratic syntactic quirks is period-style concatenation.  Otherwise, there is always sprintf: $output = sprintf("With result: %s and %s", $var1, $var2);
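The two styles side by side, with made-up variable values:

```php
<?php
$var1 = 'apples';
$var2 = 'oranges';

// PHP concatenates strings with a period rather than '+':
$joined = 'Comparing ' . $var1 . ' and ' . $var2;

// sprintf() is the printf-style alternative:
$output = sprintf('With result: %s and %s', $var1, $var2);

echo $joined, "\n", $output, "\n";
```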

Also, PHP doesn't seem very strict, which actually may make things more difficult in a way....  We'll see.

I'm working on a program that will try to utilize global, static and local variables, reviewing the differences and ways to utilize these, but within multi-nested for loops and conditionals, it has sometimes been difficult to determine with certainty.  Specifically, I'm producing a labyrinth for this week's assignment.  So far, I have not been clever enough to give it a randomly generated circuitous pathway, as I had intended, so it's not very impressive and merely showcases the elements, variables, loops and statement types we're studying this week.  While it looks pretty pathetic, it's actually been fun to try to see how many ridiculous conditionals I can produce in the futile attempt to make it challenging.  Here is one random instance; just imagine really high walls and intimidating creatures roaming within:



Tuesday, July 12, 2016

Internet Programming!

I didn't think I wanted to pursue a career, specifically, in front-end web design and development, but, I must say, writing HTML5 (Hypertext Markup Language) and CSS3 (Cascading Style Sheets) this week was a lot of fun, very enjoyable.  My personal website utilizes the Bootstrap framework and showcases links, tables, images, various other elements and attributes and a little bit of CSS3's WebKit extensions for Safari and Chrome.  My goals were to keep the DOM organized, without redundant or thoughtless markup, and to imitate something of the look of modern web design.  There's room for improvement in these respects, but I'm happy to have produced this product and am inspired to write more and keep it up.  Also, I've set up a foundation on which I could add JavaScript or JQuery in the future.

Configuring a connection to the host in Aptana wasn't difficult, but maintaining the file path in the Project Explorer was, somewhat; like Eclipse, with which I have some experience, it's rather finicky.  I've got what appear to be duplicate paths, where one is remote and the other local.  I work in the local system, which saves to my computer, and then synchronize the two, via the latter's connection to the host, to push changes.  (On one occasion, I re-opened the closed software to find my web project had disappeared -- probably due to an issue with space on my hard drive(!), which had caused my computer to crash and not save properly -- so I don't know whether my re-writing the files or re-downloading them from the host was a causal factor, or whether this always happens regardless, so I'll need to look into it further and hope there aren't any issues going forward!)  Otherwise, Aptana's predictive text capability, if you will, is really very convenient, as is the sort of confirmation it gives of the validity of elements and attributes.

Friday, June 17, 2016

SQL Final Project

Our team decided to design and build a music streaming database for our final project in SQL, something similar to Spotify, consisting of artist, album, subscriber, playlist, etc. entities and their corresponding relationships and attributes.  Both working with the team and thereby reviewing the many lessons and concepts we've learned over the course proved to be educational and, also, enjoyable.

One of the biggest challenges I've faced working with a team in the classroom setting has been the lack of both managerial direction and the subsequent communication to fill that void.  When time constraints exist, duplicating work and comparing output is a luxury for which there are insufficient resources.  Organization becomes critical in planning and delineating responsibilities, as does a sense of urgency and timeliness in executing on the project, a challenge given diverse schedules and commitments outside the program.  I think it must be somewhat similar to the workplace environment, obviously absent the manager, since we are all colleagues here on equal footing, and these are typical challenges any group and group project must face.  But there was significant improvement, this time, in all these respects, and I feel very good about our working situation, which is very pleasant.

Working with a team benefitted me in that I gained greater clarity over, and understanding of, SQL rules, best practices, code and concepts, such as those in database modeling and writing SQL commands, sometimes via rigorous, constructive discussion and debate.  Each team member contributed a great deal and brought their strengths and knowledge to the benefit of the project.  We wish we had more time to finish the final and add additional code to showcase what we've learned, given the deadline was moved forward, but all aspects of the spec have been addressed, and it's satisfying to submit on time.  I've read that this is also an issue in the professional workplace, where one doesn't want to call a project complete because one feels there is still more to be done; yet, at some point, it's best to just deliver it.

Tuesday, June 14, 2016

Developing and Refining my Understanding of some more Functions and Operations

Some of the lessons this week involved processing and formatting text using various functions, such as concatenation, which has a rather strange || syntax, upper/lower, substring, TO_CHAR, year, month and other date/time-related functions.  Additionally, we've explored creating sequences (which are database objects) for automatically generated, consecutive numerical identity attributes, which come in handy for table entities that lack good, natural candidate keys.  The CREATE VIEW operation is also handy for generating a storable, virtual tabular report from a lengthier SELECT query, which can then be re-run with a simple SELECT against the view's name alone.
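A hedged sketch of those pieces, run here through PDO against an in-memory SQLite database so it is self-contained (Oracle's TO_CHAR and CREATE SEQUENCE have no direct SQLite equivalents, so an auto-incrementing key stands in for the sequence; all names are invented):

```php
<?php
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// An auto-generated key plays the role of an Oracle sequence here.
$pdo->exec('CREATE TABLE artists (id INTEGER PRIMARY KEY AUTOINCREMENT, name TEXT)');
$pdo->exec("INSERT INTO artists (name) VALUES ('miles davis'), ('nina simone')");

// || concatenation, UPPER and SUBSTR in one expression:
$tag = $pdo->query(
    "SELECT UPPER(SUBSTR(name, 1, 5)) || '...' AS tag FROM artists ORDER BY id"
)->fetchColumn();

// A view stores the query under a name a plain SELECT can reuse.
$pdo->exec('CREATE VIEW artist_names AS SELECT UPPER(name) AS name FROM artists');
$count = $pdo->query('SELECT COUNT(*) FROM artist_names')->fetchColumn();
```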

I've also learned that standard SQL does not support conditional or looping operations like other programming languages; however, persistent stored modules (PSMs), blocks of code containing standard statements, can be saved and run.  Oracle uses Procedural Language SQL (PL/SQL) to store and run, or anonymously run, such procedural code of the kind traditionally used in programming within the database.  The blocks are delimited with DECLARE and BEGIN/END.  A named block can be saved and invoked by name, which is convenient for business operations.  Similarly, stored procedures and triggers can automatically invoke such procedural code blocks.  With respect to triggers, the text gives the example of updating a product's inventory number upon a sale and flagging inventory levels when quantity falls below a certain number, so items can be re-ordered.  (These are not things one should have to remember to do periodically.)  The syntax involves the block structure mentioned just above, plus timing and event indications.  Otherwise, as throughout, I've tried, when multiple solutions exist, to use the most efficient or precise command or function to achieve the desired result.
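The textbook's inventory example can be sketched as a trigger; here it is written for SQLite via PDO so it runs standalone (Oracle's PL/SQL trigger syntax differs, e.g. :NEW references and a DECLARE section, and the table names and threshold are made up):

```php
<?php
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$pdo->exec('CREATE TABLE product (id INTEGER PRIMARY KEY, qty INTEGER,
                                  reorder_flag INTEGER DEFAULT 0)');
$pdo->exec('CREATE TABLE sale (product_id INTEGER, amount INTEGER)');

// On each sale: decrement inventory, then flag low stock automatically.
$pdo->exec("
    CREATE TRIGGER after_sale AFTER INSERT ON sale
    BEGIN
        UPDATE product SET qty = qty - NEW.amount WHERE id = NEW.product_id;
        UPDATE product SET reorder_flag = 1
            WHERE id = NEW.product_id AND qty < 5;
    END
");

$pdo->exec('INSERT INTO product (id, qty) VALUES (1, 10)');
$pdo->exec('INSERT INTO sale VALUES (1, 7)');   // fires the trigger

$row = $pdo->query('SELECT qty, reorder_flag FROM product WHERE id = 1')
           ->fetch(PDO::FETCH_ASSOC);
// qty is now 3, and the reorder flag was set without any manual step.
```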

Tuesday, June 7, 2016

Penultimate SQL Lesson before Finals Week

This week's assignments involving SQL*Plus have been enjoyable, as it gives me more pleasure to get into the tool and interact with it than to design diagrams, although that's not to say the latter hasn't been important in learning database modeling and design fundamentals.  In SQL*Plus, we're continuing to spool our file runs, and now applying SQL aggregate functions, as well as Oracle SQL commands to prompt for and accept input.  Further, we're compounding queries to make subqueries based on the one directly prior.
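A compounded query of the kind described, with an aggregate subquery feeding the outer SELECT; sketched via PDO against in-memory SQLite so it runs standalone (the invoice data is invented):

```php
<?php
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$pdo->exec('CREATE TABLE invoice (id INTEGER, total REAL)');
$pdo->exec('INSERT INTO invoice VALUES (1, 10.0), (2, 30.0), (3, 50.0)');

// Which invoices exceed the average invoice total?
// The inner aggregate query runs first and feeds the outer WHERE.
$rows = $pdo->query('
    SELECT id, total FROM invoice
    WHERE total > (SELECT AVG(total) FROM invoice)
    ORDER BY id
')->fetchAll(PDO::FETCH_ASSOC);
// The average here is 30, so only invoice 3 qualifies.
```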

Tuesday, May 31, 2016

Entity Relationship Modeling and Diagraming

This week, we're learning how to design entity relationship diagrams to graphically depict various components of entities, attributes and their relationships.  Hopefully, it will enable me to better implement database designs.  There's a lot to take in, so reviewing components previously studied in past modules is necessary.

Tuesday, May 24, 2016

Normalization

Normalization is an analytical process that yields good database design and table structure with minimal data redundancy, and therefore fewer anomalies, predicated on the concepts of attribute determination and dependency.  The process works through stages, from lower to higher normal forms, including Boyce-Codd Normal Form (BCNF).  (Sometimes denormalization, or less normalization, is best; for example, when one needs to better reflect an organization's real operations, or when end-user demand requires faster querying, since higher forms may require more resources or more relational join operations.)  However, the first through third normal forms, along with BCNF, should almost always be considered when analyzing a database design.

A table in 1NF has identifiable primary keys and attributes and lacks repeating groups, in which one piece of data represents a set of data, with nulls standing in for appropriate data values.  To be in 2NF, a table must, in addition to the 1NF requirements, be without partial dependencies, where an attribute depends on only part of a composite primary key.  To be 3NF compliant, in addition to the above, a table must lack transitive dependencies, where one nonprime attribute determines another nonprime attribute.  In both cases, the offending determinant is copied and made the primary key of its own table, together with its dependent, which is extracted from the table of origin.

Lastly, for discussion here, is BCNF, which can only be violated when the table contains more than one candidate key; the violation occurs when a determinant, such as a non-key attribute that determines part of a key, is not itself a candidate key.  It can be resolved by reassigning the primary keys.  A table is in BCNF when every determinant is a candidate key.
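To make the 3NF step concrete, here is a hedged sketch of a decomposition: a hypothetical orders table in which city depends transitively on the key (order_id determines customer_id, which determines city) is split so the determinant gets its own table.  Run via PDO against in-memory SQLite; all names and data are invented:

```php
<?php
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Decomposed (3NF) form: city now depends only on customer_id,
// the primary key of its own table.
$pdo->exec('CREATE TABLE customer (customer_id INTEGER PRIMARY KEY, city TEXT)');
$pdo->exec('CREATE TABLE orders (
    order_id INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES customer(customer_id)
)');

// The city is stored once, however many orders the customer places.
$pdo->exec("INSERT INTO customer VALUES (7, 'Monterey')");
$pdo->exec('INSERT INTO orders VALUES (100, 7), (101, 7)');

// A join recovers the original (pre-decomposition) rows.
$city = $pdo->query('
    SELECT c.city FROM orders o JOIN customer c USING (customer_id)
    WHERE o.order_id = 101
')->fetchColumn();
```

The redundancy removed here is exactly the update anomaly normalization targets: changing the city means touching one row, not every order.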

Tuesday, May 17, 2016

Advancing with SQL

Through this week's assignments, I've developed a greater understanding of the relationships and interactions between tables.  Creating foreign keys requires establishing the referenced tables first, along with their primary keys, and joining tables requires consideration of the variety of queries one may run.  Null values in place of common, connecting attributes, for example, require outer joins, from the left or right depending on where the otherwise excluded rows occur.  Other times, a simple natural join is the quickest command, if one is further specifying conditions that would make any excluded bits of data irrelevant.
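The point about nulls and outer joins can be demonstrated with Python's built-in sqlite3 module (the tables and data here are invented for illustration): an inner join drops rows with no match, while a left outer join keeps them, filling the missing attributes with NULL (None in Python).

```python
import sqlite3

# In-memory database with a parent table and a child table referencing it.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE customer (cust_id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, "
            "cust_id INTEGER REFERENCES customer)")
cur.executemany("INSERT INTO customer VALUES (?, ?)", [(1, "Ada"), (2, "Grace")])
cur.execute("INSERT INTO orders VALUES (1, 1)")  # only Ada has an order

# Inner join: Grace is excluded because she has no matching order row.
inner = cur.execute(
    "SELECT c.name, o.order_id FROM customer c "
    "JOIN orders o ON c.cust_id = o.cust_id ORDER BY c.cust_id"
).fetchall()

# Left outer join: Grace is kept, with None standing in for the missing order.
outer = cur.execute(
    "SELECT c.name, o.order_id FROM customer c "
    "LEFT JOIN orders o ON c.cust_id = o.cust_id ORDER BY c.cust_id"
).fetchall()

print(inner)  # [('Ada', 1)]
print(outer)  # [('Ada', 1), ('Grace', None)]
```

Which table sits on the left of the LEFT JOIN determines whose unmatched rows survive with nulls.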

We've used spool commands to capture our output, which is very helpful for debugging, because it's far easier to re-run a file than to re-type the code.  We've also separated our creation and modification files to help to this end, which is a best practice I'll apply in the future.

I had some debugging issues with parent integrity, due to a mismatched character data type between a primary and foreign key.

Otherwise, I'm reminded that double- and triple-checking data is still sometimes not enough to catch every typo, certainly as the database grows, which is frustrating.  Sometimes, reading code backwards (right-to-left) to view data from a new perspective or taking a break and returning to review insertion commands later, can help.

It's nice to see Database Administrator jobs posted -- 189 today in SF alone on indeed.com -- and to be learning some of the skills required in the specs.  I also understand that software developers, engineers and analysts alike all use SQL in their work, more or less.

Tuesday, May 10, 2016

SQL

This week, it's been exciting to study SQL (Structured Query Language) commands for creating databases and relational table structures, performing various types of data manipulation and administration, and querying the database to retrieve useful information!  These include commands and clauses such as INSERT, SELECT (FROM, WHERE, GROUP BY, HAVING, ORDER BY), UPDATE, DELETE, COMMIT and ROLLBACK, along with comparison and logical operators and aggregate functions.  Interestingly, SQL is a relatively simple language to learn, comprising a vocabulary of fewer than 100 words.  We've also studied more about relational databases and some basic relational algebra and its operators: UNION, INTERSECT, DIFFERENCE, PRODUCT, SELECT, PROJECT, and JOIN.  I've realized, contrary to the musings of last week's blog, that the SQL dialects (such as Oracle's and MySQL's) are similar enough that in studying and learning one, I can more easily work with any of them.  Furthermore, according to Coronel et al., Oracle was number one in the RDBMS sector in 2010....  Anyway, the industry is always changing, and it'll be interesting to see what happens to DBMSs as unstructured data and Big Data, and with them NoSQL, become more prevalent.  Still, it seems there will long, if not always, be a need for structured database models for traditional information, of which the relational model is tops.
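As a quick illustration of the grouping and aggregate clauses (using sqlite3 from the Python standard library, with an invented table): WHERE filters rows before grouping, while HAVING filters the groups after aggregation.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE sale (region TEXT, amount REAL)")
cur.executemany("INSERT INTO sale VALUES (?, ?)",
                [("West", 100.0), ("West", 250.0), ("East", 80.0)])

# Aggregate per group, then filter the groups themselves with HAVING.
rows = cur.execute("""
    SELECT region, COUNT(*), SUM(amount)
    FROM sale
    GROUP BY region
    HAVING SUM(amount) > 100
    ORDER BY region
""").fetchall()

print(rows)  # [('West', 2, 350.0)] -- East's total of 80 fails the HAVING test
```

The same condition placed in a WHERE clause would instead exclude individual sale rows before the sums were computed.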

Tuesday, May 3, 2016

Relational Database Management Systems

According to Coronel, Morris and Rob, databases are shared, integrated computer structures that store a collection of end-user data and metadata in a methodical way that is much more dynamic, malleable and synchronized than the traditional file system.  Databases avoid some flaws of the file system: for example, it's possible to add single bits of unconnected data without having to add filler data to a ledger's expansive columns, and deletion/edit anomalies no longer occur.  It's easy to understand how databases could be valuable when a company has trillions of pieces of raw data to manage, process and store.  Also, databases allow for quick transformations of raw data into graphical or tabular presentations of more meaningful information, which improves business decision-making.

I have no prior education or experience in database modeling, i.e., representing and storing data, or in the management software used to create, manage, store and query the interrelated data on behalf of single or multiple end users, so I'm looking forward to learning it, other database concepts and statement writing, and, in particular, MySQL.  According to Coronel et al., databases result in improved data sharing, security, integration, productivity and access, as well as minimized inconsistency.  Single users have desktop databases, and multiple users have either workgroup or enterprise databases.  Also, if the data is located at a single site, the database is centralized, in contrast to distributed.

General-purpose databases are varied, while discipline-specific ones are tailored to their related disciplines, such as medical or financial records.  Alternatively, operational databases may focus on something like the transactions occurring day-to-day.  Analytical databases focus on storing metrics used for tactical decision-making, having "massaged" or manipulated raw data to extract valuable information.  This data is made structured by the processing of raw data into business intelligence.  Most databases mentioned above will use at least semi-structured data, perhaps made textual by XML.

The specific database model we will study is the predominant relational one, which has the logical structuring of related tables, connected by super, primary, candidate, foreign and other keys, central to its system.

After listening to the class orientation, my understanding is that we'll largely be studying relational database models and primarily using Oracle, with a focus on foundational theory and design, so I'll be interested to compare the two systems and to study implementing MySQL on my own, if necessary, given its ubiquity in the workplace.  (I'm hopeful the divide between theory and practice won't be too great in this respect, as is sometimes the case in academia, and that I'll develop not only a foundation here but a working, transferable knowledge applicable in a professional capacity such as database developer, designer, administrator, architect, consultant, etc.)  My aim is to gain a working proficiency that includes implementation, and I'm looking forward to this class!  Still, I appreciate the value of learning strong design fundamentals, which is crucial.

I had a slight delay getting started this week, as I went to the South Bay to collect a PC from my sister, who's agreed to let me borrow it for the length of the class.  There were no issues installing Oracle, which I left running overnight and returned to in the morning.

Reference

Coronel, Carlos, Steven Morris, and Peter Rob. Database Principles: Fundamentals of Design, Implementation, and Management. 10th ed. Mason, OH: South-Western, 2013. Print.

Wednesday, April 20, 2016

Python Course End

The three key takeaways from my final project -- a reverse spectrogram that transforms images into sounds:
  • Multimedia is remarkably fungible, if you will, thanks to digital data/digitization: audio bit depth and amplitude can relate numerically to image pixels.
  • Python can manipulate both images and sound.
  • Pair programming is an acquired skill that requires practice.  Its challenges for me involved differing skill levels between teammates, my ability to concentrate in a group setting, and my ability to articulate difficult or abstruse concepts.  Some solutions involve delineating a clear plan and outline of responsibilities, improved efforts to communicate, patience and research.
It was a fun project to build, but please lower the volume before playing, because the result is highly discordant!  Brace yourself; here is an example of the output:
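As a rough sketch of the core idea behind the reverse spectrogram (this is not our project code; the sample rate, duration and base frequency are made-up parameters): treat each image row as a frequency band, use the pixel's brightness as that band's amplitude, and sum sinusoids to produce the audio samples for one column of the image.

```python
import math

def column_to_samples(column, sample_rate=8000, duration=0.01, base_freq=220.0):
    """Turn one image column (list of 0-255 brightness values) into audio samples.

    Each row index i becomes a sine wave at base_freq * (i + 1); the pixel's
    brightness scales that wave's amplitude.  Summing the waves gives the
    samples for this column's slice of time.
    """
    n = int(sample_rate * duration)
    samples = []
    for t in range(n):
        s = 0.0
        for i, brightness in enumerate(column):
            amp = brightness / 255.0          # map 0-255 brightness to 0.0-1.0
            freq = base_freq * (i + 1)
            s += amp * math.sin(2 * math.pi * freq * t / sample_rate)
        samples.append(s / max(len(column), 1))  # keep values roughly in [-1, 1]
    return samples

# A bright pixel in row 0 and a dark one in row 1:
samples = column_to_samples([255, 32])
print(len(samples))  # 80 samples (0.01 s at 8000 Hz)
```

Sweeping this over every column of an image, then scaling the samples to the audio format's bit depth, yields the discordant result described above.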








Tuesday, April 12, 2016

Entering the Final Stretch

This week, we studied some Python data structures or storage containers like lists and dictionaries.  It's really incredible what Python enables one to do with so few lines of code.  The more structures and built-in functions I learn, the smaller my programs seem to get.
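A small, generic example of that shrinkage (not from our assignments): counting word frequencies goes from a multi-line loop to a single expression as you pick up dict.get and comprehensions.

```python
# Counting word frequencies the long way...
words = ["spam", "eggs", "spam", "ham", "spam"]
counts = {}
for w in words:
    if w in counts:
        counts[w] += 1
    else:
        counts[w] = 1

# ...more compactly, with dict.get supplying a default of 0:
counts2 = {}
for w in words:
    counts2[w] = counts2.get(w, 0) + 1

# ...or as a single dictionary-comprehension expression:
counts3 = {w: words.count(w) for w in set(words)}

print(counts3["spam"])  # 3
```

All three produce the same dictionary; the built-in structures do the bookkeeping that would otherwise take explicit branching.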

The documentation provided on soft skills like code review has been a very good read.  The best practices were particularly valuable:

     1. Review fewer than 200-400 lines of code at a time.

     2. Aim for an inspection rate of fewer than 300-500 LOC per hour.

     3. Take enough time for a proper, slow review, but not more than 60-90 minutes.

     4. Authors should annotate source code before the review begins.

     5. Establish quantifiable goals for code review and capture metrics so you can improve your processes.

     6. Checklists substantially improve results for both authors and reviewers.

     7. Verify that defects are actually fixed!

     8. Managers must foster a good code review culture in which finding defects is viewed positively.

     9. Beware the “Big Brother” effect.

     10. The Ego Effect: do at least some code review, even if you don’t have time to review it all.

     11. Lightweight-style code reviews are efficient, practical, and effective at finding bugs.

Our team is working on the outline for our final project, and there will be more on that next week!  It's going to be pretty cool, involving the reverse transformation of images into sound.  It'll be pretty cacophonous -- I don't think we'll be implementing any music theory on harmony, but we'll see what we can do.  Incidentally, I've used the word "cool" about ten times today.  I think it's an unconscious mantra as I enter finals and am beginning to feel the burn.

Wednesday, April 6, 2016

Module 5

It's good to read in the HuffPost, in Silicon Valley’s Race to Hack Happiness, that coders are trying to build apps that aim to increase people's happiness and add to the greater good in society.  These efforts are laudable, and it's good to see hackathons like Wisdom 2.0 and the Happiness App Challenge further promote them.  The article claims that anxiety and depression are on the rise in our society.  (Is that really the case?  Do we now tend in greater numbers to lead more isolated, stressed, nasty, unfulfilled lives than our forebears did?  Or, rather, is there simply more acceptance of and openness about unhappiness, individualism, or mental health and illness, as the case may be, which has led to increased discussion or reporting of them?)  In any case, since people spend so much time on their smartphones, increasing happiness is a highly commendable development project, which I will look into further in hopes of joining.

Wednesday, March 30, 2016

Module 4

This week, we have begun to learn about and work with sound files, including the concepts of sampling, amplitude, and bit depth.  It's an interesting, complex application for Python, which I never before thought I would take on.  Overall, working with a variety of media has been a lot of fun -- and this is coming from someone who does not really consider themselves a creative -- and I think it's been a great medium for studying programming.

I've found the GitHub tutorial exciting, as is the fact that my team will start pushing and pulling the code we share.  It's a great tool to know for working on a team, so we'll be getting a lot of practice using it and hopefully becoming more and more familiar with it.


Friday, March 25, 2016

Module 3 - Keeping up with my Learning Journal

Pair programming is more difficult for me than coding solo.  I appreciate that the driver is free to focus on tactical maneuvers while the navigator observes strategy and prospective future issues, but it's difficult for me to 1) concentrate when working in front of others, 2) articulate what I am coding while I'm thinking it through, and 3) always interpret what others are writing while they're doing so.  There's quite a lot of silent time, but that's OK.  I wonder if others share this perspective and find it a challenge as well.  Of course, ultimately, it's helpful and worthwhile to have constructive, valuable feedback from the navigator.  Hopefully, with practice, I will become more comfortable with this method of writing.

Sunday, March 20, 2016

Module 3 - Image Manipulation Portfolio


Image Manipulation Portfolio

Over the last couple of weeks, we've explored Python multimedia programming through the Jython Environment for Students (JES).  The functions below require arguments to be passed in when called, and when that argument is a picture or image, we must first create and store appropriate variables via the command line (pickAFile() and makePicture()).

1.  Rose Colored Glasses
Reading over some of my colleagues' code, I have learned that this function needn't be as complicated as my approach below.  It's enough to loop through an image's pixels and alter each RGB value by a percentage relative to red.  Nevertheless, here I used the same approach many filters use: first converting the image to black and white by invoking a BnW function, looping through that image's pixels, and setting the red and blue to varying degrees depending on whether the pixel is a shadow, mid-tone or highlight.

def roseColoredGlasses(pic):
  """problem 3 takes picture and repaints it so that images appears a rosy pink"""
  grayscale = betterBnW(pic)    # convert image to grayscale by calling betterBnW()
  pixels = getPixels(grayscale)
  for pix in pixels:
    r = getRed(pix)
    b = getBlue(pix)
    #tint shadows
    if r < 63:
      r *= 1.1
      b *= 0.9
    #tint midtones
    elif r >= 63 and r < 192:
      r *= 1.75
      b *= 0.85
    #tint highlights
    elif r >=192 and r <= 255:
      r *= 2.48
      b *= 0.93
    else:
      r = 255
      b *= 0.93
    setRed(pix, r)
    setBlue(pix, b)
  rose_colored = grayscale
  repaint(rose_colored)
  writePictureTo(rose_colored, '/Users/lisabenson/Documents/CST205/Module3/rose_colored_glasses.jpg')
  return rose_colored

Calla lilies, before and after:



_______________________________________________________________________________
2.  Negative
This function loops through an image's pixels and sets each RGB value to its opposite: the opposite of 0 is 255, the opposite of 1 is 254, and the opposite of 127 is 128.

def makeNegative(pic):
  """creates the negative of original picture"""
  pixels = getPixels(pic)
  for pix in pixels:
    r = getRed(pix)
    g = getGreen(pix)
    b = getBlue(pix)
    new_r = 255 - r
    new_g = 255 - g
    new_b = 255 - b
    setRed(pix, new_r)
    setGreen(pix, new_g)
    setBlue(pix, new_b)
  repaint(pic)
  #writePictureTo(pic, '/Users/lisabenson/Documents/CST205/Module2_Lab3/negative_pic.jpg')
  return pic

CSUMB campus artwork, before and after:



_________________________________________________________________________________
3.  Better Black and White
This function simulates a black and white image by converting the color image to grayscale.  Each RGB value in a pixel must be the same, e.g. 0, 0, 0 or 1, 1, 1.  One way to do this would be to take the average of a pixel's RGB values and reassign it to each channel, but a better formula weights the channels for perceived luminance, R*0.299 + G*0.587 + B*0.114, which becomes the new value of each RGB channel as the function loops through the pixels.

def betterBnW(pic):
  """makes an image grayscale with luminance formula R*0.299 + G*0.587 + B*0.114"""
  pixels = getPixels(pic)
  for pix in pixels:
    r = getRed(pix)
    g = getGreen(pix)
    b = getBlue(pix)
    luminance = r*0.299 + g*0.587 + b*0.114
    setRed(pix, luminance)
    setGreen(pix, luminance)
    setBlue(pix, luminance)
  repaint(pic)
  return pic  # if function was called as a helper function, comment out repaint(pic) and use return pic

Monterey Bay sand dunes, before and after:



_________________________________________________________________________________
4.  Bottom-to-top
This function mirrors the bottom half of the image so that the top half becomes its reflection.  It loops through the bottom half of the image and copies each pixel to the mirrored location above.

def bottomToTop(pic):
  width = getWidth(pic)
  half_height = getHeight(pic)/2
  height = getHeight(pic)
  for x in range(0, width):
    for y in range(half_height, height):
      px = getPixel(pic, x, y)
      opp_px = getPixel(pic, x, height - 1 - y)
      c = getColor(px)
      setColor(opp_px, c)
  repaint(pic)
  writePictureTo(pic, '/Users/lisabenson/Documents/CST205/Module2_Lab4_Voss/bottom-to-top.jpg')

Bicycles parked outside the CSUMB STEM building, before and after:




_________________________________________________________________________________
5.  Shrink
This function shrinks a picture.  First, we create a new, empty picture with the width and height reduced.  Then, we loop through the width and height of the original image in increments of the reduction factor; to get half the size, we loop in increments of two.  We then reconstitute the image by copying each sampled pixel into the new picture at coordinates that advance by only one.

def shrink(picture):
  """make a copy of a picture that is half as big as the original"""
  width = getWidth(picture)
  height = getHeight(picture)
  shrunk_pic = makeEmptyPicture(width / 2, height / 2)
  a = 0
  for x in range(0, width, 2):
    b = 0
    for y in range(0, height, 2):
      px = getPixel(picture, x, y)
      shrunk_px = getPixel(shrunk_pic, a, b)
      c = getColor(px)
      setColor(shrunk_px, c)
      b += 1
    a += 1
  repaint(shrunk_pic)
  writePictureTo(shrunk_pic, '/Users/lisabenson/Documents/CST205/Module2_Lab4/shrunk_pic.jpg')
  return shrunk_pic

Photograph of the letter "t" from the cover of Sunset Magazine (March 2016), before and after:





_________________________________________________________________________________
6.  Make A Collage
This function copies manipulated images onto a target image at specified x,y locations.



def makeCollage():
  target = makePicture(pickAFile()) # 2550, 3500  # 637 and 875
  show(target)
  pyCopy(makeNegative(), target, 0, 0)
  pyCopy(rotatePic(), target, 800, 0)
  pyCopy(artify(), target, 1600, 0)
  pyCopy(quadrupleMirror(), target, 1600, 800)
  pyCopy(bottomToTop(), target, 800, 800)
  pyCopy(halfHalfBlue(), target, 0, 1600)
  pyCopy(lightenUp(), target, 800, 1600)
  pyCopy(betterBnW(), target, 1600, 1600)
  pyCopy(sepia(), target, 800, 2400)
  pyCopy(redEye(), target, 0, 2400)

#-------------------------------------------

def pyCopy(source, target, targetX, targetY):
  """function creates a new, bigger blank picture and copy the picture to the middle of it"""
  width = getWidth(source)
  height = getHeight(source)
  # target = makeEmptyPicture(width + targetX, height + targetY)  # Optional to invoke with empty target picture of any size, which will be rewritten
  for x in range(0, width):
    for y in range(0, height):
      old_pixel = getPixel(source, x, y)
      color = getColor(old_pixel)
      new_pixel = getPixel(target, (targetX + x), (targetY + y))
      setColor(new_pixel, color)
  show(target)
  writePictureTo(target, '/Users/lisabenson/Documents/CST205/Module2_Lab5/mycollage_Voss.jpg')
  return target

#-------------------------------------------

def makeNegative():
  """creates the negative of original picture"""
  filename = pickAFile()
  pic = makePicture(filename)
  pixels = getPixels(pic)
  for pix in pixels:
    r = getRed(pix)
    g = getGreen(pix)
    b = getBlue(pix)
    new_r = 255 - r
    new_g = 255 - g
    new_b = 255 - b
    setRed(pix, new_r)
    setGreen(pix, new_g)
    setBlue(pix, new_b)
  return pic

#-------------------------------------------

def rotatePic():
  filename = pickAFile()
  picture = makePicture(filename)
  width = getWidth(picture)
  height = getHeight(picture)
  rotated_pic = makeEmptyPicture(height, width)
  for x in range(0, width):
    for y in range(0, height):
      px = getPixel(picture, x, y)
      rotated_px = getPixel(rotated_pic, y, x)
      c = getColor(px)
      setColor(rotated_px, c)
  return rotated_pic

#-------------------------------------------

def verticalMirror(picture):
  width = getWidth(picture)
  half_width = getWidth(picture)/2
  height = getHeight(picture)
  for x in range(0, half_width):
    for y in range(0, height):
      px = getPixel(picture, x, y)
      opp_px = getPixel(picture, width - 1 - x, y)
      c = getColor(px)
      setColor(opp_px, c)
  return picture

#-------------------------------------------

def horizontalMirror(picture):
  width = getWidth(picture)
  half_height = getHeight(picture)/2
  height = getHeight(picture)
  for x in range(0, width):
    for y in range(0, half_height):
      px = getPixel(picture, x, y)
      opp_px = getPixel(picture, x, height - 1 - y)
      c = getColor(px)
      setColor(opp_px, c)
  return picture    


#-------------------------------------------

def quadrupleMirror():
  filename = pickAFile()
  picture = makePicture(filename)
  vertical_pic = verticalMirror(picture)
  quadruple = horizontalMirror(vertical_pic)
  return quadruple

#-------------------------------------------


def bottomToTop():
  filename = pickAFile()
  pic = makePicture(filename)
  width = getWidth(pic)
  half_height = getHeight(pic)/2
  height = getHeight(pic)
  for x in range(0, width):
    for y in range(half_height, height):
      px = getPixel(pic, x, y)
      opp_px = getPixel(pic, x, height - 1 - y)
      c = getColor(px)
      setColor(opp_px, c)
  return pic

#-------------------------------------------

def halfHalfBlue():
  filename = pickAFile()
  picture = makePicture(filename)
  for x in range(0, (getWidth(picture)/2)):
    for y in range(0, (getHeight(picture)/2)):
      px = getPixel(picture, x, y)
      b = getBlue(px)
      setBlue(px, b * .25)
  return picture

#-------------------------------------------

def lightenUp():
  filename = pickAFile()
  pic = makePicture(filename)
  pixels = getPixels(pic)
  for pix in pixels:
    old_color = getColor(pix)
    new_color = makeLighter(old_color)
    setColor(pix, new_color)
  return pic

#-------------------------------------------

def betterBnW():
  filename = pickAFile()
  pic = makePicture(filename)
  pixels = getPixels(pic)
  for pix in pixels:
    r = getRed(pix)
    g = getGreen(pix)
    b = getBlue(pix)
    luminance = r*0.299 + g*0.587 + b*0.114
    setRed(pix, luminance)
    setGreen(pix, luminance)
    setBlue(pix, luminance)
  return pic

#-------------------------------------------

# Sepia
def lum(color):
  # lum = R*0.299 + G*0.587 + B*0.114

  c = .299 * color.getRed() + .587 * color.getGreen() + .114 * color.getBlue()
  return c

def sepiaInternal(color):
  r = b = g = lum(color)

  if r < 63:
    r *= 1.1
    b *= .9
  elif r < 192:
    r *= 1.15
    b *= .85
  else:
    r *= 1.08
    b *= .93
  color.setRGB(min(int(r), 255),int(g),int(b))

def sepia():
  filename = pickAFile()
  pic = makePicture(filename)
  height = getHeight(pic)
  width = getWidth(pic)
  for y in xrange(0, height):
    for x in xrange(0, width):
      # lum
      p = getPixelAt(pic, x, y)
      c = getColor(p)
      sepiaInternal(c)
      setColor(p, c)
  return pic

#-------------------------------------------

def redEye():
  filename = pickAFile()
  picture = makePicture(filename)
  for x in range(198, 274):
    for y in range(367, 450):
      current_pixel = getPixel(picture ,x, y)
      if (distance(red,getColor(current_pixel)) < 165):
        setColor(current_pixel, black)
  return picture

#-------------------------------------------
# Shawn's Artify
def artify():
  filename = pickAFile()
  pic = makePicture(filename)
  height = getHeight(pic)
  width = getWidth(pic)
  rlist = [31, 95, 159, 223]
  glist = [4, 17, 36, 73]
  blist = [22, 33, 180, 200]
  for y in xrange(0, height):
    for x in xrange(0, width):
      p = getPixelAt(pic, x, y)
      c = getColor(p)
      r = rlist[c.getRed() / 64]
      g = glist[c.getGreen() / 64]
      b = blist[c.getBlue() / 64]
      setColor(p, makeColor(r, g, b))
  return pic

                                         
_________________________________________________________________________________
7. Red-eye Reduction
For this function, if a sampled color is within a pre-specified distance of pure red, it will be replaced.  Targeting the locations from which to take samples limits smudging in areas away from the eyes.  I chose a bright green below to highlight the change.

def redEye(picture, replacementColor):
  for x in range(142, 194):   # targeted x,y locations limits smudging
    for y in range(263, 313):
      current_pixel = getPixel(picture ,x,y)
      if (distance(red,getColor(current_pixel)) < 165):
        setColor(current_pixel, replacementColor)
  repaint(picture)
  return picture

Cottontail rabbit, before and after:


_________________________________________________________________________________
8. Artify
This was part of a pair programming assignment, so Shawn drove and wrote the function, and deserves credit here.  He first creates lists of four red, green and blue values, then loops over the image's pixels across its height and width.  Dividing each old RGB value (0-255) by 64 with integer division yields an index of 0 through 3, which selects the new value from the corresponding list.

# Shawn's Artify
def artify(pic):
  height = getHeight(pic)
  width = getWidth(pic)
  rlist = [31, 95, 159, 223]
  glist = [4, 17, 36, 73]
  blist = [22, 33, 180, 200]
  for y in xrange(0, height):
    for x in xrange(0, width):
      p = getPixelAt(pic, x, y)
      c = getColor(p)
      r = rlist[c.getRed() / 64]
      g = glist[c.getGreen() / 64]
      b = blist[c.getBlue() / 64]
      setColor(p, makeColor(r, g, b))
  return pic



 
_________________________________________________________________________________
9. Green-screen
I used Adobe Photoshop to hastily lasso and cut a selfie and affix it to a green-layer background.  Using explore() in JES, I sampled the RGB values of the green background to find a representative one and made it a color to be passed into the function, along with the background image and a distance threshold: any foreground pixel whose color falls within that distance of the key color is replaced with the corresponding background pixel.  This was part of a pair programming assignment, so Cian drove and wrote the function, and deserves credit here.  He wrote the function to work for pictures of different sizes.

# Cian's Chromakey
def chromakey(backg, foreg, dist, col):   # greenback = makeColor(56, 152, 24)
  if getWidth(backg) >= getWidth(foreg):
    for x in range(0, getWidth(foreg)):
      if getHeight(backg) >= getHeight(foreg):
        for y in range(0, getHeight(foreg)):
          if distance(getColor(getPixel(foreg, x, y)), col) < dist:
            p = getPixel(foreg, x, y)
            setColor(p, getColor(getPixel(backg, x, y)))
      else:
        for y in range(0, getHeight(backg)):
          if distance(getColor(getPixel(foreg, x, y)), col) < dist:
            p = getPixel(foreg, x, y)
            setColor(p, getColor(getPixel(backg, x, y)))
  elif getWidth(backg) <= getWidth(foreg):
    for x in range(0, getWidth(backg)):
      if getHeight(backg) >= getHeight(foreg):
        for y in range(0, getHeight(foreg)):
          if distance(getColor(getPixel(foreg, x, y)), col) < dist:
            p = getPixel(foreg, x, y)
            setColor(p, getColor(getPixel(backg, x, y)))
      else:
        for y in range(0, getHeight(backg)):
          if distance(getColor(getPixel(foreg, x, y)), col) < dist:
            p = getPixel(foreg, x, y)
            setColor(p, getColor(getPixel(backg, x, y)))
  show(foreg)
  return foreg





_________________________________________________________________________________
10. Home-made St. Patrick's Day Card

def stPatCard(background, image1, image2):
  background = rotatePic(background)
  new_image = addText(background)
  rainbow_image = addRainbow(new_image)
  pic_with_added_image1 = addImage(image1, rainbow_image, 20, 10)
  artify(image2)
  pic_with_added_image2 = addImage(image2, pic_with_added_image1,145, 550)
  show(pic_with_added_image2)
  writePictureTo(pic_with_added_image2, '/Users/lisabenson/Documents/CST205/Module3_Lab7/st-pat_card.jpg')

def rotatePic(background):
  width = getWidth(background)
  height = getHeight(background)
  if width > height:
    rotated_pic = makeEmptyPicture(height, width)
    for x in range(0, width):
      for y in range(0, height):
        px = getPixel(background, x, y)
        rotated_px = getPixel(rotated_pic, y, x)
        c = getColor(px)
        setColor(rotated_px, c)
    return rotated_pic
  else:
    return background

def addText(pic):
  gold = makeColor(204, 204, 0)
  style = makeStyle(mono, bold, 30)
  text = "Happy St. Patrick's Day!"
  addTextWithStyle(pic, 70, 390, text, style, gold)
  return pic

def addRainbow(pic):
  #Adds RGB rainbow to card
  purple = makeColor(153, 0, 255)
  addArcFilled(pic, 145, 500, 300, 100, 0, 180, purple)
  addArcFilled(pic, 145, 480, 300, 100, 0, 180, blue)
  addArcFilled(pic, 145, 460, 300, 100, 0, 180, green)
  addArcFilled(pic, 145, 440, 300, 100, 0, 180, yellow)
  addArcFilled(pic, 145, 420, 300, 100, 0, 180, orange)
  addArcFilled(pic, 145, 400, 300, 100, 0, 180, red)
  return pic

def artify(pic):
  height = getHeight(pic)
  width = getWidth(pic)
  rlist = [31, 95, 159, 223]
  glist = [4, 17, 36, 73]
  blist = [22, 33, 180, 200]
  for y in xrange(0, height):
    for x in xrange(0, width):
      # lum
      p = getPixelAt(pic, x, y)
      c = getColor(p)
      r = rlist[c.getRed() / 64]
      g = glist[c.getGreen() / 64]
      b = blist[c.getBlue() / 64]
      setColor(p, makeColor(r, g, b))
  return pic

def addImage(source, target, targetX, targetY):
  """function creates a new, bigger blank picture and copy the picture to the middle of it"""
  width = getWidth(source)
  height = getHeight(source)
  # target = makeEmptyPicture(width + targetX, height + targetY)  # Optional to invoke with empty target picture of any size, which will be rewritten
  for x in range(0, width):
    for y in range(0, height):
      old_pixel = getPixel(source, x, y)
      color = getColor(old_pixel)
      new_pixel = getPixel(target, (targetX + x), (targetY + y))
      setColor(new_pixel, color)
  return target

Here's the funny little St. Patrick's Day Card I made using addText(), addArcFilled() and others:



_________________________________________________________________________________
11. Advanced Image Processing Technique
This function changes each pixel according to the difference between its color and the colors of the pixels to its right and below.  Without limiting the loop ranges to one less than the width and height, we would exceed the image bounds when we look up the pixels to the right and below later in the function.  The absolute value of the difference ensures we work with a positive number.  For the images below, I passed in a distance value of 10, because anything higher resulted in a mostly white-washed result.

def imageProcessor(pic, dist_val):
  pic = betterBnW(pic)  # take a color picture and convert it to black and white
  for x in range(0, getWidth(pic) - 1):
    for y in range(0, getHeight(pic) - 1):
      pix = getPixel(pic, x, y)
      pix_below = getPixel(pic, x, (y + 1))
      pix_right = getPixel(pic, (x + 1), y)
      if abs(distance(getColor(pix), getColor(pix_below))) > dist_val and abs(distance(getColor(pix), getColor(pix_right))) > dist_val:
        setColor(pix, black)
      else:
        setColor(pix, white)
  repaint(pic)
  writePictureTo(pic, '/Users/lisabenson/Documents/CST205/Module3_Lab7/img_process.jpg')
  return pic

def betterBnW(pic):
  """makes an image grayscale with luminance formula R*0.299 + G*0.587 + B*0.114"""
  pixels = getPixels(pic)
  for pix in pixels:
    r = getRed(pix)
    g = getGreen(pix)
    b = getBlue(pix)
    luminance = int(r*0.299 + g*0.587 + b*0.114)  # truncate to an integer channel value
    setRed(pix, luminance)
    setGreen(pix, luminance)
    setBlue(pix, luminance)
  return pic

Portrait taken on my favorite holiday in Mürren, CHE, before and after:






Tuesday, March 15, 2016

Module 2

A funny thing about module two: once you digitally manipulate an image, it's digitally manipulated.

Horizontal Mirroring can have an unexpected outcome.

What is that, am I swimming in a very reflective lake? If you turn it sideways it looks kinda like a happy gnome-like creature.
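That lake-reflection effect comes from mirroring around a horizontal axis: the top half of the picture is copied, row by row, onto the bottom half. A pure-Python sketch on a 2D list of rows (JES's getPixel/setColor aren't available outside JES, so this is just the idea, not my JES code):

```python
def mirror_horizontal(grid):
    """Mirror around a horizontal axis: reflect the top half onto the bottom."""
    height = len(grid)
    for y in range(height // 2):
        # row y reflects onto row (height - 1 - y)
        grid[height - 1 - y] = list(grid[y])
    return grid

# each inner list is one row of "pixels"
pic = [[1, 1], [2, 2], [3, 3], [4, 4]]
mirror_horizontal(pic)  # bottom half becomes a reflection of the top
```

The bottom half of the original is thrown away, which is exactly why the result looks like standing water under the top half of the scene.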

Tuesday, March 8, 2016

Python!

I'm currently finishing a Java data structures class and beginning a Python one, albeit one with Jython lessons, which gives me a nice opportunity to neatly juxtapose the two languages.

Studying JPG, GIF, TIFF, PNG and BMP and learning when to use each one was very useful.  It was interesting to read that JPG is the format of choice on the internet because of its excellent quality even at rather high compression settings.  JPG is not lossless; rather, it is lossy, but it discards the information the eye is least likely to notice.

Manipulating a digitized image's pixels with code in JES was pretty fun.  I had only used software programs to do so in the past.  Thanks to CSUMB, I currently have an Adobe Creative Cloud membership, so I also took some time to explore Photoshop's tools and features for manipulating images, which was decidedly more difficult than writing similarly behaving code in JES!  I definitely preferred writing my own methods!


Thursday, February 25, 2016

Finals Week! (Module 8)

The last couple of months have flown by, and the final week of our Pro-Seminar has arrived.  Most of all, I've enjoyed the opportunities to conduct some research on the tech industry, further hone my writing skills and study techniques, develop some detailed career and academic plans and goals, study a variety of topics from project management to ethics, and collaborate with a great team.  With respect to Team Enhydra Solutions, I think we're solid and work well together.  Despite busy work-life schedules, we've still made time to meet twice weekly for an hour or more.  I look forward to continuing our collaboration in our Python/Multimedia class during the next module.  Otherwise, perhaps the most important thing I will take with me is a greater sense of ease when it comes to publishing content online.  I'll take that and my improved writing skills to my next job, and I'll take up blogging on subjects other than my academic progress....

Our final videos:









Tuesday, February 23, 2016

Module 7

Team Enhydra Solutions will produce a video on advancements and stalemates in technological development over the decades.  We'll give some consideration to past fantastical forecasts and whether or not they've come to pass.  Then, we'll look ahead to some of the current predictions for the future.  We're currently listing the specific projects we'll recount and the media we'll try to use.

I watched a few videos with subjects similar to ours on Ted.com.  The first was "Bill Joy: What I'm worried about, what I'm excited about" (Joy, 2006).  He begins by speaking about the way technologically simple solutions can still provide great benefits for mankind, as in the case of clean water sourcing, which has a greater effect on a population's health and longevity than access to antibiotics (which is certainly not to belittle the immense contribution of the latter).  He adds that technology has created an asymmetrical situation in which single actors have a powerful capacity to cause tremendous damage.  He mentions the developments in genome sequencing that enable digital reconstruction of eradicated diseases for study, if not for immediately nefarious ends.  It's worrisome that it is so difficult to defend against terrible acts that have become much easier to commit on a massive scale.  Joy argues that while these threats are severe, "we can't give up the rule of law to fight an asymmetric threat, which is what we seem to be doing because of the present, the people that are in power, because that's to give up the thing that makes civilization" (Joy, 2006).

Turning to remedies for some of the problems on which he cogitates, Joy argues that we now need to design the future and attempt to lessen the risk of these problems or the probability that they will come to pass.  He is confident that improvements can be made in the three fields on which he focuses: education, environmental protection and pandemic biodefense.  First, Moore's Law will, as a practical matter, result in affordable computers for educational use.  Second, new materials that are lighter and stronger, such as nanotubes, will benefit the environment: "[they] can make water, ...fuel cells [that] work better, ...catalyze chemical reactions, ...cut pollution and so on. Ethanol -- new ways of making ethanol. New ways of making electric transportation. The whole green dream -- because it can be profitable. And we've dedicated -- we've just raised a new fund, we dedicated 100 million dollars to these kinds of investments. We believe that Genentech, the Compaq, the Lotus, the Sun, the Netscape, the Amazon, the Google in these fields are yet to be found, because this materials revolution will drive these things forward" (Joy, 2006).

Another video I viewed was "Joi Ito: Want to innovate? Become a 'now-ist'" (Ito, 2014).  In it, Ito discusses the changing paradigm of technological development: since the rise of the internet, innovation happens from the ground up and with greater rapidity and immediacy.  Now, the developer's maxim might not be 'publish/plan or die', but 'develop or die.'  He elaborates, "...so it's happening in software and in hardware and bioengineering, and so this is a fundamental new way of thinking about innovation. It's a bottom-up innovation, it's democratic, it's chaotic, it's hard to control. It's not bad, but it's very different, and I think that the traditional rules that we have for institutions don't work anymore, and most of us here operate with a different set of principles. One of my favorite principles is the power of pull, which is the idea of pulling resources from the network as you need them rather than stocking them in the center and controlling everything" (Ito, 2014).  Ito was able to resolve a problem himself using the communicative, immediate and organic nature of development through the internet.  After the 2011 9.0 earthquake in Japan damaged the Fukushima nuclear power plant, Ito wanted to know the radiation levels, which no government or NGO was able to provide or make available, because his family resided only 200 km away.  Right away, he took matters into his own hands with fellow concerned internet users around the world, and this without prior planning, infrastructure or supplies. "Three years later, we have 16 million data points, we have designed our own Geiger counters that you can download the designs and plug it into the network. We have an app that shows you most of the radiation in Japan and other parts of the world. We are arguably one of the most successful citizen science projects in the world, and we have created the largest open dataset of radiation measurements" (Ito, 2014).


References

Ito, J. (2014, March).  Joi Ito: Want to innovate? Become a "now-ist" [Video file].  Retrieved from
     http://www.ted.com/talks/joi_ito_want_to_innovate_become_a_now_ist

Joy, B. (2006, February).  Bill Joy: What I'm worried about, what I'm excited about [Video file].
     Retrieved from http://www.ted.com/talks/bill_joy_muses_on_what_s_next/transcript?language=en