Saturday, August 10, 2013

Response to an Overwhelmed Professor about Tech

Original post: "God do I hate technology"
     I started thinking when I read this post from the blog “The Online Science Educator for the Distance Learning Community.” I believe Geralyn Caplan stands where a lot of educators do when integrating technology into education. She may even be ahead of her peers because she acknowledges her frustration yet perseveres in including technology. I don’t believe technology is the underlying issue; I believe the rate of change is what frustrates educators today.
     At one of the universities where I work, the course curriculum did not change during the ten years I taught the courses offered for the degree. There were paper modules for students and instructors to follow. The information in the modules for some courses, like management, was out of date. I followed the basic outline of the modules but added up-to-date information on the subject matter. I made sure I included core concepts from the other courses in the cohort and researched how those core concepts had evolved. Even with this updated information, the way I delivered it did not change much back then.
     Pedagogy today is similar to how faculty have taught for decades. Active, informed, and problem-based learning, peer-led discussions, and small-group work are all still used in the classroom. I will focus on two developments in curriculum design that increase the challenges for instructors. The first is how curriculum content is delivered within the classroom: differentiation strategies have expanded how students can learn more effectively, broadening pedagogical approaches. The second is integrating technology into those differentiation strategies and finding appropriate digital solutions for applying course concepts. Technology is evolving at a rapid rate. Not only do professors need to differentiate material, they must also keep up with how technology can benefit students and their profession.
     Learning management systems (LMSs), like Blackboard or D2L, evolve and add new tools. Some schools even switch LMSs because of effectiveness or expense. Change is not occurring by the decade; it occurs by the semester. At times the cycle can be even shorter, if a significant discovery advances course content. Change is not just about research-based content anymore, but about how the content is delivered and how digital solutions are integrated into the curriculum.
     New research emerges every day, and the way curriculum is taught is shifting. Students in one of my classes took part in a Twitter debate with three other social work schools across the country. This exercise stayed with the students even into the next semester. They spoke about what they learned and how it affected them, and the lesson expanded to how they could use Twitter at their internships and workplaces. Technological innovation creates the need for faculty to broaden their horizons and accept change in every field. I had never participated in a Twitter debate, nor was I comfortable with the technology. Twitter? The program seemed useless to me. But because I opened myself to change, I found a new way for students to relate to the material and deepened my own understanding of new digital tools. My initial bias was due more to fear of change than to the evils of technology.
     So, do most faculty hate technology, or do they hate the rate of change technology brings to their profession? Maybe the place to start is for universities to become aware of the Transtheoretical Model of change and to educate faculty about technology integration with those processes of change in mind.

Monday, July 29, 2013

“Binging” Child Porn Searches


Technology is beginning to be used to advance the protection of children. Bing is the first search engine to display pop-up notices when pedophiles search for lewd pictures of children. The warning notifies the user that the search is illegal and offers a link to counseling services. The UK plans to block all child pornography by requiring Internet users to opt in to view pornography. This sends a strong message that anyone accessing this material is being tracked. Google is not participating in this tool to block child pornography, yet the power of Google could be instrumental in blocking access to sites that endanger children.

So my question is: why is the United States not adopting this tool? Why is only Microsoft's UK search engine, Bing, participating? How much would child pornography decrease if all search engines blocked these searches? I believe the implications of using this tool are mostly positive.

I do understand freedom of speech and the concerns it raises in the United States. There are also issues with filters mistakenly targeting legitimate searches, and with how to prevent that. Tech-savvy people can probably bypass the blocks, but how many others will it deter? Isn't protecting children who cannot protect themselves worth this plan? And what about prevention? Interest in child pornography starts somewhere; making it less accessible may deter some voyeurs.

I hope there will be studies in the UK to evaluate the effectiveness of such measures.


Sunday, July 7, 2013

Mood Sensing Software vs. Big Brother

As I write this, I must admit I am undecided about the positive and negative uses of data mining with clients. Two articles brought my thoughts to the forefront. The first is about a new dimension in data mining from Microsoft Research in Asia, called MoodScope. Identifying the emotions happy, tense, calm, upset, excited, stressed, and bored, MoodScope was accurate 93% of the time with 32 volunteers when adjusted to individual users on their smartphones. Immediately, I thought of how this could be carried into a therapeutic setting, with client and therapist working as a team on mood identification and regulation. This is not much different from The Durkheim Project's use of artificial intelligence to analyze Facebook and smartphone data to statistically monitor users for harmful behavior. Veterans are the first study participants. That project addresses data security and confidentiality among users, therapists, and the information received.
What is the difference between these two projects? The difference is in how they use the information gathered. My eye twitched when I read the next paragraph of the MoodScope article:

“The researchers suggest third party hooks could be added to the software to allow for automatically transmitting user moods to applications like Facebook. They also acknowledge that privacy concerns could arise if the software were to be delivered to the public, but suggest the benefits of such software would likely outweigh such concerns. They note that sites like Netflix or Spotify could use data from MoodScope to offer movies or other content based on specific users' moods.”


Advertisers will decide what content we see on smartphones, tablets, or computers based on our moods? Will a client be offered Prozac Nation if they research antidepressants, or Leaving Las Vegas if their searches center on where to find an AA meeting? We can only hope there are not directions to nearby vineyards (with a coupon, no less). How will the effect of these choices be judged as constructive or destructive to a person's behavior? As technology progresses, there needs to be some overall body regulating the innovations of the digital age.

Advertising already manipulates consumers through commercialism (Van Tuinen, 2011). Now the manipulation will be even more personal. We know about the influence of commercials in children's programming, targeted smoking and drinking media, and other forms of product bias. How are we going to advocate for vulnerable populations in the context of manipulation by digital algorithms? Will clients believe the technology is reading their minds? In a sense, this Orwellian, 1984-ish software is intelligent and watching every stroke made on a smartphone or computer. The big question is: how will you, as a social worker, be aware of these tools and advocate for best practices ensuring ethical and confidential use of data mining? If you have any suggestions, please let us know!

References

Van Tuinen, H. K. (2011). The Ignored Manipulation of the Market: Commercial Advertising and Consumerism Require New Economic Theories and Policies. Review Of Political Economy, 23(2), 213-231. doi:10.1080/09538259.2011.561558
