Calvin, Ogulcan, Rosa Rationale for survey REVISED

Calvin, Ogulcan, Rosa

There are many surveys about the student experience at educational institutions. However, most of these surveys are broad and do little to empower students to express themselves fully; students often withhold their most candid answers, and their responses can be quite biased. Our survey aims to identify which aspects of faculty matter most to students at City Tech and which aspects the school lacks. From the responses, we hope to gain a bigger picture of what students are satisfied with in terms of faculty and where faculty need to improve.

 

We went through numerous revisions in order to achieve the right focus for our survey. In the beginning, our intention was to create a survey purely to evaluate the effectiveness of professors, similar to how RateMyProfessors or RateMyTeachers operates. However, the problem was that the feedback and ratings students give to professors are too subjective and too general, so the data we would receive would not be accurate and would not really measure what we intended it to.

 

Our second revision focused on student satisfaction with different aspects of the school, such as education, faculty, and services. However, we found there were too many variables to take into account: the survey became too lengthy for students, and it was too similar to existing surveys such as CUNY AIR and the NSSE. We did not want to create an inferior survey that would measure nearly the same thing. Another obstacle was that we kept asking ourselves what the purpose was, which made it difficult to proceed with this revision.

 

Going back to the drawing board for a new direction, we researched teacher evaluation surveys: what they measure, what they intend to measure, why they aren’t as effective as they could be, and potential solutions to these problems. We wanted to create a new version of the City Tech teacher evaluation survey, but for political reasons we were not able to fully pursue it and make it the main focus of our project.

 

What we have now is a survey of open-ended questions, with flexible answers, that pertain to faculty. It differs from our original ideas in that the questions are much more meaningful and much more specific. From the data, we hope to find out which elements of City Tech matter most to students, which aspects of faculty the school needs and can improve on, and the trend in overall satisfaction with faculty.

 

Moving on to the rationale, we want to know about the relationship between students and faculty. Do faculty really respect students? Do they affect how you get your education at City Tech? Are faculty there to help you when you need it the most? Each of our questions specifically targets the faculty and how to improve it.

 

The first question asks which factor about faculty matters most to students. This question is important to us because it will help us understand which aspects matter most to students when they evaluate a faculty member.

 

The second question is a scale question that asks how seriously students take the student evaluation forms at the end of the semester. We have often seen students give high ratings to professors who did not deserve them. We want to know whether students take the forms seriously and put time into them.

The third question asks students how faculty affect which courses they register for at City Tech. This is a big question because faculty play a huge role in students’ academic careers. Students want the right professor for their classes; some professors have full respect for students and some have none. Our survey aims to identify the elements behind this.

 

The fourth question asks what difficulties students have encountered so far with faculty at City Tech. This question is important because it will help reveal problems that might undermine students’ experience with faculty at City Tech, and it will show us what faculty need to work on.

 

Our fifth question asks about problems students may have encountered when trying to talk with the department chair. Because the chair is the head of the department, she controls courses and programs and oversees the faculty. Therefore, students need to be able to communicate with her if they have any issues.

 

The sixth question asks students what they want to see improved about faculty teaching. This question will help reveal practices that faculty can develop further to better convey the material to students.

Our seventh question asks what students would like faculty to do to keep them more interested and engaged in lectures and classes. This question will reveal what students want to happen in class in order to retain their interest, which will help students want to learn more and also help them learn more.

 

Our eighth question asks about the quality of academic advisement. Students often come in for advisement and leave without proper guidance; many times faculty do not advise properly and simply agree with whatever the student asks, which doesn’t work, since the student is the one with the problem. We are asking this question to find out whether students are being properly advised. It will also help us build a visual representation and get a comprehensive idea of how students feel about the faculty in general.

 

The ninth question asks, on a scale, how satisfied students are overall with the faculty. This will be a supporting question for our main topic and a backup to the previously asked questions.

Survey Rationale- Derrick,Lian,Deniel,Chris

Survey questions Rationale:
1. What is your major?

By knowing the participant’s area of study, we get an idea of what their focus is and how likely the computer club is to appeal to them.

2. Are you already a part of another club here at City Tech?

Since the designated time for club hours is the same for all clubs, a student who is already in another club is less likely to join the computer club.

3. Rate in order from 1-4 (4 being the highest and 1 being the lowest) which factor would influence you the most when considering joining a club?

We want to know what influences students most when they consider joining a club, and what the computer club can do to attract more students.

4. Do you think the curriculum provided by City Tech fully prepares you for your career?

To find out whether students feel they need more help, which the computer club could in turn provide.

5. Are you interested in becoming certified in your field after you graduate?

Certification assistance was one of the things it was suggested the computer club could provide, so this question is aimed at gauging students’ interest.

6. If you were to join the computer club, what would your expectations of the club be? (Check all that apply)
___ Development of computer skills   ___ Workshops   ___ Speeches

This question seeks to find out what is expected when joining the computer club so that students may feel compelled to join and actually take part

7a. Did you know that City Tech has a computer club prior to reading the memo?
b. If yes, how did you hear about it? Select all that apply.
___ Friend   ___ Website   ___ Bulletin board   ___ Teacher

From the question above, we will find out exactly how prominent the computer club was before we informed students about it. From the answers to part b, we can then recommend areas where the computer club could better advertise itself.

8. Having read the memo, would you consider joining the computer club?

The last question gauges the overall interest of students towards the computer club after learning more about it through our memo

Calvin, Ogulcan and Rosa Rationale for Survey


The first question asks what matters most to a student at City Tech. This question is important to our project because it is the introductory part of the survey, and it will help connect to the other questions we ask.

The second question asks what matters most to students about the instructional staff at City Tech. This question is important because it will help us, as well as faculty members, find out the strengths and weaknesses of the staff and give an idea of what the staff can improve.

The third question asks students how faculty affect which courses they register for at City Tech. This is a big question because faculty play a huge role in students’ academic careers. Students want the right professor for their classes; some professors have full respect for students and some have none. Our survey aims to identify the elements behind this.

The fourth question asks students what they think the faculty lacks. Availability, promptness, teaching practices, and good communication with students are primary responsibilities of faculty members, so the intention of this question is to get a sense of which of those duties needs to be improved, and from there to figure out ways to make it more efficient.

Our fifth question asks what students would like to see improved about the faculty at City Tech. This question will help us see the main issues between students and faculty.

Our sixth question asks how satisfied students are with the faculty at City Tech. This question will help us create a visual representation and get a comprehensive idea of how students feel about the faculty in general.

The final question asks the student to describe their ideal faculty member. This question will give us the greatest understanding of which aspects are most important and have the most impact on a student.

 

Jason, Ray, Vincent, Revised Survey

Our survey can be found on Dropbox.

The survey asks questions about our game and our manual, in an attempt to determine the usefulness/accessibility of the manual and the effectiveness of the game.  The data will give us an idea of how our usability testing went, showing what we did right and what we did wrong with regard to both the game and the manual.  We will show our survey results in the final write-up and final presentation, probably with a graph of some sort.  The survey will be short, with only 17 questions, many of which are optional.  It should take less than ten minutes to complete.  Below we list the rationale for each of the questions.
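As a rough sketch of how the scale-question results could be tallied for the graph mentioned above, here is a minimal Python example; the response numbers are hypothetical stand-ins, not real survey data:

```python
from collections import Counter
from statistics import mean

# Hypothetical answers to a 1-5 scale question (illustrative data only).
responses = [1, 2, 2, 3, 1, 2, 4, 1, 2, 3]

counts = Counter(responses)   # tally of each rating
average = mean(responses)     # overall average rating

# Print a simple text bar chart of the distribution.
for rating in range(1, 6):
    print(f"{rating}: {'#' * counts[rating]} ({counts[rating]})")
print(f"average rating: {average:.1f}")
```

The same `counts` dictionary could feed a bar chart in a plotting library or a spreadsheet for the final presentation.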

  1. “Did you read the user’s manual?” – A vital question, if they didn’t read our user’s manual then many of our other questions can’t be answered, and we would get no data on this part of our project.
  2. “On a scale of 1-5, how difficult was it for you to navigate through the user’s manual (5 being most difficult)?” – This question will of course be skipped if the person taking the survey answered “No” to #1.  This question is also very important since it addresses the general accessibility of the manual.
  3. “Did the manual contain all the information you needed about system requirements?” – A simple Yes/No question that will determine whether or not we missed any vital information about this part of the manual.
  4. “If you answered “No” to question number 3, what didn’t you find in the manual about system requirements?” – A much needed followup to question #3.
  5. “Did the manual contain all of the information you needed to install the game?” – Like question #3, but addressing another crucial part of the manual.
  6. “If you answered “No” to question number 5, what information didn’t you find in the manual about installation?” – An optional followup for #5.
  7. “Did the manual contain all of the information you needed to configure Dungeon Escape’s controls, graphics, and sound to your liking?” – This is similar to questions #3 and #5, but addressing yet another part of the manual.
  8. “If you answered “No” to question number 7, what information was missing from the manual regarding configuration?” – A followup just like #4 and #6.
  9. “Did the manual contain all of the information you needed in order to get a basic understanding of the gameplay of Dungeon Escape?” – Like the previous questions, this will let us know whether or not this section of the manual is thorough.
  10. “If you answered “No” to question number 9, what gameplay information didn’t you find in the manual?” – Last one, we promise.
  11. “On a scale of 1-5, how frightening did you find the game to be (5 being most
    frightening)?”
    – In essence, this lets us know whether or not we did a good job designing the game.  It’s a horror game so it is meant to be scary.  This tells us whether or not the game was effective at delivering terror into the minds of our audience… they should remember the game forever.
  12. “On a scale of 1-5, how difficult did you find the game to be (5 being most difficult)?” – This sets up for #13.  With both of these questions we want to find out the difficulty tolerance that our audience has, and find out if we should consider trimming down the difficulty by a notch (which Jason would probably refuse to do).
  13. “Would you say the game is too difficult?” – We are considering integrating this into question #12’s numerical scale, such as just making 5 being “too difficult”.
  14. “What aspects of the game did you find to be most difficult?” – It will be useful to know which aspect of the game is the most difficult.  Combined with questions #12 and #13, it could give us an idea of what area of the game we should consider making less difficult.
  15. “Did you enjoy playing Dungeon Escape?” – We couldn’t make this survey without asking this.
  16. “Would you recommend this game to others?” – Like the previous question, this question is basically a requirement.  If they say yes that means in theory a game like this would sell, which means we did a good job!
  17. “What kind of gameplay changes, if any, would you have liked to see in the game?” – Their input here will give us an idea of what gameplay features they’d like to see (or existing ones they’d like to see removed).

We made many revisions to our survey.  The first one was the addition of a short paragraph at the top of the survey, that states its purpose.  Afterwards we rearranged the order of the survey to better fit this statement.  Questions about the manual now come first, and these questions are in a logical order (sort of like reading through the manual from front to back).

Questions relating to the game itself are now grouped after the questions about the manual.  We followed tradition in putting questions like “Did you enjoy the game?” and “Would you recommend this game to others?” at the end of the survey.  Another big revision involved breaking down one question into several, more specific ones.  Looking at the rationale listed above, questions 3-10 all stem from one original, broad question we had, which was “Did the manual contain all of the information you needed?”  We took this question and broke it down, making them revolve around specific topics within the manual.

These revisions make the survey more professional, more clear, more detailed, and they will provide us with more clear and detailed data.  We’ll now understand what specific aspects about the manual need improvement.  Our survey will be packaged with our manuals, which will be distributed via printed copies.  However, if we have enough time we’ll try to gather even more data by letting others download our game from the internet, complete with the manual and the survey, and send the survey results back to us.  This wouldn’t be a public assessment though, Jason would send the game and manual/survey to distant friends and contacts of his, and get the results back from them.  This isn’t a necessity for our project but more data couldn’t hurt.

Jason, Ray, Vincent, Survey first draft

Note: The actual survey will be distributed via printed, physical copies.

Dungeon Escape
Survey by Jason Choy, Ray Chen, Vincent Cornelio

1. Did you enjoy playing Dungeon Escape?

___ Very much so ___ Yes ___ Unsure/neutral ___ No ___ Not at all

2. On a scale of 1-5, how frightening did you find the game to be (5 being most frightening)?

[1]           [2]           [3]           [4]           [5]

3. On a scale of 1-5, how difficult did you find the game to be (5 being most difficult)?

[1]           [2]           [3]           [4]           [5]

4. Would you say the game is too difficult?

___ Yes           ___ No           ___ Uncertain

5. What aspects of the game did you find to be most difficult?

___ Puzzles   ___ Hostile creatures  ___ Getting lost   ___ Running out of light   ___ Other

6. Would you recommend this game to others?

___ Yes           ___ No           ___ Uncertain

7. Did you read the user’s manual?

___ Yes           ___ No           ___ Somewhat

8. On a scale of 1-5, how difficult was it for you to navigate through the user’s manual (5 being most difficult)? Skip this question if you answered “No” to question number 7.

[1]           [2]           [3]           [4]           [5]

9. Did the manual contain all of the information you needed? Skip this question if you answered “No” to question number 7.

___ Yes                     ___ No

10. If you answered “No” to question number 9, what didn’t you find in the manual?

 

11. What kind of changes, if any, would you have liked to see in the game?

 

12. What kind of changes, if any, would you have liked to see in the manual?

An Application for Citytech – Survey

Please complete the survey with this link  –   Survey Link 

NYCCT

An Application for New York City College of Technology
* Required
How often do you visit Citytech’s website? *
Knowing how frequently students visit Citytech’s website tells us how reliable the respondents’ replies are likely to be.
This is a required question
When you visit Citytech’s website, what page do you visit most often? *
We will know which website page to optimize the view for our application.

This is a required question
Rate the difficulty of navigating the college website. *
It gives us a general idea of people’s impression of the website. Since our goal is to make the app simpler to use than the website, people who find the site difficult to use may like our app better.
Very difficult
Very easy
This is a required question
If a mobile application is built for Citytech, would you use it? *
A very important question: if we create the mobile app for Citytech, will people use it? If the application would not be popular, creating it may not be beneficial to the school, and the project would be canceled.
This is a required question
Do you own a smartphone? *
This finds the percentage of smartphone users, which leads to the next question.
This is a required question
If yes to the previous question, what operating system does your smartphone use?
This determines what mobile platform we will create the app for first.
Check all that apply.
Would you like to communicate with your fellow students within the app? *
ex. Selling/buying textbooks.
It is one of our planned features. If people like it, we can continue developing it and make it usable.
This is a required question
Scale your interest in reading Citytech’s news. *
Another planned feature.
Not interested
Very Interested
This is a required question
Is there any feature you would like to see in the application?

Comments and Questions.

Guidelines for Fieldwork: Surveys

[Surveys]

As discussed in class, for this week, you are moving from traditional academic research to fieldwork (or qualitative, ethnographic work). This is an organic process, so the questions you ask for your interviews (if you are doing them) should develop from the data your survey generates, and the survey questions should be informed by your initial observations.

For your survey, you will be using an online tool to create/distribute the survey. I suggest SurveyMonkey, as it is free/easy to use, but you are welcome to use another tool if you feel more comfortable with that (just send me an e-mail to let me know what it is before getting started). If you don’t already have an account, you should create a free one: just one account per group. Use one person’s e-mail, and make a password that you all can use (whoever has the account can then go back after the semester is over and change the password to something more personal/private).

Keep in mind that with SurveyMonkey there is a maximum (with the free, basic account) of 10 questions and 100 survey responses. Another alternative that seems to offer unlimited questions/responses is KwikSurveys. If you feel this would be more suitable for your purposes, then I would suggest creating the revised survey using this software (especially if you want more than 10 questions or think you will get more than 100 responses).

As we’ve experimented with different online survey tools, there seem to be some advantages to using KwikSurveys, such as the unlimited questions you can ask and the unlimited number of responses (e.g., SurveyMonkey only allows 100 responses to any one survey with the basic account). Also, in KwikSurveys, you can put that introductory blurb at the top of the survey. However, there are some important features that are only available in KwikSurveys if you upgrade and pay $10/month (something you don’t necessarily need to do), so your group has to decide how to play around with them. Again, there are other free online survey tools that you can play around with too. Report back if you find anything useful that we should know about (just “comment” on this post).

If anyone wants to suggest other software (and do some research on other viable options), please do so as a comment/reply to this post.

Before you get started, you should read a bit about how to create effective surveys. A good place to start is this help page from SurveyMonkey. At this point, you are not launching your survey yet: in class on Tuesday we will peer review the surveys and make necessary revisions, and you can make it go live after that. You should have a minimum of 10 questions for your survey.

For this survey, you should make sure to include a statement of use, which discusses what the project is, what the data being collected will be used for (a class project), and that people’s information will remain confidential/anonymous. Including this statement and having users participate through “informed consent” is an important part of the research project and any fieldwork (we will discuss this more next week with the interviews, for which you will actually use consent forms before you interview anyone).

You should revise your survey according to the class feedback/discussion. Some things to think about in terms of revisions:

  • Think through the order of your questions (is it a logical progression?)
  • Which questions are you going to make mandatory (require a response for), and which ones will be optional?
  • What types of questions (comment boxes/free responses, multiple choice, ranking, etc.) will you use? Remember that different types serve different purposes: think about the data you want to collect, and how the types of questions will encourage (or discourage) participation in the survey.
  • Be as specific as possible (if you mean the City Tech Computer Club, state that: not just the “Computer Club”)
  • You may even consider having multiple surveys (for example, a group working on smoking on campus would consider having two separate surveys: one for smokers, one for non-smokers)
  • If relevant, collect demographic information (such as age range, gender, etc.)
  • Proofread/edit your survey

I made comments on each group’s surveys, but here are some general thoughts for revision:

  • It is a good idea to inform your participants (in that introductory blurb) how long the survey should take. Since the surveys should only take a few minutes to complete, this should be a good way to convince people to take it. You can do this in SurveyMonkey. To do so, at the start of the survey, click “Add New Page” and then you can put a title and text explaining the survey.
  • It is also a good idea to provide some background/contextualizing information for the survey/topic, so your participants can understand the survey.
  • Make sure you indicate (by choosing this setting when creating the questions) whether the question is mandatory/required or not. I would suggest making the majority of the questions mandatory (which means that the survey cannot be answered without participants answering them).
  • Remember that satisfaction/dissatisfaction questions (do you like/not like something) are useful, but only to a certain extent. General dissatisfaction only confirms there is a problem, but doesn’t provide insight into how to solve that problem (and remember that the goal of the project is to provide context-specific recommendations for improvement). Therefore, make sure to ask questions that are more specific.
  • As we discussed, it would be helpful to have an open-ended (free response/text box) question at the end of the survey, that says something like, please list any additional questions/comments related to the topic. You will likely get interesting information here that you didn’t originally anticipate, and that may be useful for your project.
  • Make sure to delete all the data (reset the surveys) from your peer reviews this week. When you launch the survey, there should be no responses. Do not edit questions once the survey is launched (this will taint the data).
  • Publicize the survey as much as possible: through social media, e-mail, in person. You can print out the blurb/link to the surveys and hand them out in classes/on campus, so people can go take the survey. You can also walk around campus with tablets/mobile devices and ask people to take the survey on the spot. You can also print out surveys and ask people to write them out/give them back, but remember that you will then need to input the data yourself. Everyone should aim for at least 60-80 responses (remember, SurveyMonkey will only let you get 100 responses).
  • Each group will launch/publicize its survey no later than the week of Thanksgiving, and aim to have preliminary results to report by the next class after the break.

As always, I am available to meet with your group outside of class to discuss your project and to look at further drafts of surveys.

 

You will make a new blog post by Friday night (11/21), categorized as “Surveys,” that:

  • Lists the hyperlink for the revised survey
  • Provides a paragraph that explains the context of the survey and how the survey data will be used (this was supposed to be done by today’s class/the first draft of the survey, but most groups are missing this crucial section)
  • Provides a collaboratively written reflective discussion (at least 3 full paragraphs) of the revisions you made, why you made them (in terms of the data you want to gather), and how you plan to launch/publicize your survey (including whether you will use paper surveys too and then input the data)

This is a collaborative revision: as a group, you should discuss what needs to be changed/why, then revise the survey, and then reflect on those revisions.

By Sunday, 12/7, you will write a blog post reporting/analyzing/discussing the results of your survey (make sure to post the link to the final draft of the survey, the one that was launched, at the top of this post). You should include a written discussion as well as statistical results (in numbers as well as screenshots of your online survey tool’s statistical analysis, so we can see this data visually represented). Categorize as “Surveys.”
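If a group wants to compute its own numbers alongside the tool’s built-in statistics, most survey tools (including SurveyMonkey) can export responses as CSV, which a few lines of Python can summarize. This is only a sketch; the column name and sample rows below are hypothetical, not from any real export:

```python
import csv
import io

# Illustrative stand-in for an exported CSV file; the question column
# name here is hypothetical.
raw = io.StringIO(
    "Respondent,Would you recommend this game to others?\n"
    "1,Yes\n2,No\n3,Yes\n4,Yes\n5,Uncertain\n"
)

rows = list(csv.DictReader(raw))
answers = [r["Would you recommend this game to others?"] for r in rows]
yes_share = answers.count("Yes") / len(answers)
print(f"{yes_share:.0%} would recommend the game")
```

In practice you would replace the `io.StringIO` stand-in with `open("export.csv")` pointing at the downloaded file.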