June 7: Computational Thinking

Intro to Coding:



Background Info on Coding:
Before starting on coding today in class, I had very little experience with coding and with using technology in general. The only time I had practiced coding was in middle school. All I knew was that coding is a sequence of instructions programmed into a computer so it can perform a certain task. I also knew there were different coding 'languages', like Python.






                                                                           
Today's Project:
Out of the three options given, I chose the beginner project (Learn Python - From the beginning!) because I had no previous experience with coding. We were to complete 6 interactive lessons that teach the basics of Python and help us learn how to code. I was able to complete three lessons but still have three left to go.



What we Learned in Class:


By completing the lessons, I learned more about Python coding. One thing in particular that I remember learning is illustrated in the picture to the right. In Python, each character in a string has an index starting from 0. For example, in the string "cats", the character at index 0 is "c", and in "Ryan", the character at index 3 is "n". This indexing lets you access specific characters in a string using their positions. The step asked me to assign the variable fifth_letter to the fifth letter of the string "MONTY". At first, I kept getting the step wrong, but then I realized that the fifth letter is not at index 5, because you start counting indices from zero.
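The indexing rule described above can be sketched in a few lines of Python (a minimal example based on the "MONTY" exercise from the lesson):

```python
# Python string indices start at 0, so the Nth letter lives at index N - 1.
word = "MONTY"

fifth_letter = word[4]   # the fifth letter, "Y" (word[5] would be an IndexError)
first_letter = word[0]   # the first letter, "M"

print(first_letter, fifth_letter)  # prints: M Y
```

Trying `word[5]` instead raises an IndexError, because the valid indices for a five-character string run from 0 through 4.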









Chat GPT Reflection


Based on our research with Chat GPT, we reached the conclusion that it is biased. Here are the reasons why we believe that:

  • Time magazine states that Chat GPT might be biased because it was created mainly by men.
  • It was created by humans, and the humans who created it have specific viewpoints and also don't know everything.
  • In addition, after comparing Chat GPT's explanations with our own, we realized that Chat GPT uses more advanced words, whereas our explanation used common words, which made the summaries simpler and easier to read and understand.
  • Chat GPT left out vital examples from the articles and did not do a good job of covering all the crucial points in its summary; it seemed to just pick what it felt was important.
  • There are not many women (specifically women of color) working in technology development, and that affects the bias shown by Chat GPT.

Comparing the Summaries:

Our Summary: 

"The Time magazine and Vox discuss how machines can discriminate in harmful ways, negatively impacting individuals. The Time Magazine talks about how machines are better at recognizing white, male faces compared to colored women, illustrating how they can be racist and sexist. The Vox article emphasizes how algorithmic bias occurs when an individual builds and trains a website based on their own knowledge and understanding. After the computer understands how to complete a certain task, it will be able to complete the task way faster than humans. This is the reason computers and online applications are more preferred by everyday individuals. Both articles emphasize the reason application bias is an important issue to be addressed is because if the application is primarily built by men, they would have more bias and discriminate against women."


Chat GPT Summary:

"Both articles dive into the pressing issue of bias in artificial intelligence (AI), especially around race and gender. The Time article explains how AI systems, like facial recognition, often get things wrong because they're trained on biased data sets. This means they make more mistakes with women and people of color. Over at Vox, the article talks about how secretive AI systems can be and points out examples where algorithms in hiring, law enforcement, and healthcare have been unfair. They call for more transparency, diverse data, and better ethical standards. Both pieces highlight the urgent need to tackle these biases so that technology doesn't reinforce existing social inequalities. A big takeaway from both is the importance of using inclusive data to train AI, which can help make these systems fairer and more accurate across the board."



Links to Articles that were used for Summarizing: 




  

















