Robert Vega, Director FAC 18 / 2304 Whitis Ave. Stop G6200 78712-1508 • 512-471-7900

Faculty Profiles - Linguistics

Dr. Katrin Erk – Computational Lexical Semantics
Dr. Stephen Wechsler – Syntax and Lexical Semantics


Academic Background: Ph.D., Engineering, Computer Science Department & Postdoctoral Studies at the Computational Linguistics Institute, Saarland University – Saarbrücken, Germany; Diploma in Computer Science, Koblenz University – Koblenz, Germany

What made you decide to go to graduate school?
I somehow always wanted to. Learning things was fun, and doing research sounded cool. And it could not possibly be difficult to get a faculty position at a university, now could it? Astonishingly enough, this non-plan worked out for me. But I would not recommend doing things this way. If you are thinking about going to graduate school, better plan ahead and research your options.

What was your dissertation topic when you were in grad school?
I was working with a formalism that allowed for a partial description of logic formulas, leaving some parts of the formulas free-floating. You could also specify, for pairs of pieces of a formula, that they should have the same structure. The question was: How can we automatically compute the list of formulas (if any) that are covered by such a partial description? This formalism can be used to describe what are called ellipses in natural language. For example, in "Susan has read all books by her aunt, and Mary has, too", whose books has Mary read?
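The payoff of those free-floating pieces shows up in exactly this example: the elided clause has two readings, which can be sketched in simplified first-order notation (the predicate names here are chosen for illustration, not taken from the dissertation):

```latex
% Source clause: Susan has read all books by her aunt.
\forall x\, \big[\mathit{book}(x) \wedge \mathit{by}(x, \mathit{aunt\_of}(\mathit{susan}))
    \rightarrow \mathit{read}(\mathit{susan}, x)\big]

% Elided clause, "strict" reading: Mary read the books by Susan's aunt.
\forall x\, \big[\mathit{book}(x) \wedge \mathit{by}(x, \mathit{aunt\_of}(\mathit{susan}))
    \rightarrow \mathit{read}(\mathit{mary}, x)\big]

% Elided clause, "sloppy" reading: Mary read the books by her own aunt.
\forall x\, \big[\mathit{book}(x) \wedge \mathit{by}(x, \mathit{aunt\_of}(\mathit{mary}))
    \rightarrow \mathit{read}(\mathit{mary}, x)\big]
```

A partial description leaves the argument of the "aunt" piece underspecified while requiring the elided clause to have the same structure as the source clause; enumerating the formulas covered by that description yields exactly these readings.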

What is your area of specialization?
My current main research area is computational lexical semantics: building computational models for what words mean, and how they combine. Words are strange and slippery things. Their meaning is hard to pin down, and often they can have multiple meanings that are difficult to differentiate. For example, you can see a bird in the sky, or see that I am right, or see a doctor, or see a new boyfriend, or see to it that the grass gets mowed. All these uses of "see" are somehow related, but also different. I am working on computational models that would describe this vagueness as exactly as possible. So, the questions are: How can we describe how similar or different those uses of the word "see" are, and how can we infer the cloud of associations that comes with each of those uses?

Another research area of mine is the creation of language resources for lexical semantics: Computational linguistics today crucially relies on machine learning. For that, we need large amounts of natural language text that has been manually labeled with linguistic information as a basis for learning. But what should those labels look like to describe the phenomena best, and to best facilitate learning?

In my earlier research, I worked with logic-based representations of meaning. They are great in that they can represent complex statements with all their structure, and because they serve as a basis for automatic reasoning. But they are tough to derive because of that same complexity that makes them so powerful.

What topics do you teach at UT?
I teach undergraduate and graduate classes on computational linguistics (both introductory and advanced), and graduate courses on programming and statistics for linguists and students from related areas. I also teach a graduate seminar that takes an interdisciplinary look at word meaning and mental concepts.

What is your current research focus?
In the Isogram project, we are using what are called semantic space models to describe the meaning of words in context. These models represent a word as a point in a semantic space with thousands of dimensions, which stand for (possibly related) meanings or associations. One big advantage of these models is that they can be learned automatically from data, so word meaning does not have to be specified by hand. Another main advantage is their flexibility: They can represent similarity and association between words in a graded and flexible manner. We are also looking at linking these semantic space models to logic-based representations of meaning, which are great at representing complex, structured arguments, but do not deal well with similarity.
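The core idea behind semantic space models can be sketched in a few lines: represent each word as a vector of co-occurrence counts with nearby context words, learned directly from text, and measure graded similarity with cosine. This is a toy illustration only, not the Isogram models themselves; the corpus, window size, and word choices below are invented, and real semantic spaces are built from large corpora with thousands of dimensions.

```python
# Toy semantic space: words as co-occurrence vectors, similarity as cosine.
# The four-sentence "corpus" is invented purely for illustration.
import math
from collections import Counter

corpus = [
    "the bird flies in the sky",
    "see the bird in the sky",
    "see the doctor about the pain",
    "the doctor treats the pain",
]

def vector(word, texts, window=2):
    """Count context words within +/-window of each occurrence of `word`."""
    counts = Counter()
    for text in texts:
        tokens = text.split()
        for i, tok in enumerate(tokens):
            if tok == word:
                lo, hi = max(0, i - window), i + window + 1
                counts.update(t for t in tokens[lo:hi] if t != word)
    return counts

def cosine(u, v):
    """Graded similarity between two count vectors (0 = unrelated, 1 = identical direction)."""
    dot = sum(u[d] * v[d] for d in set(u) | set(v))
    norm_u = math.sqrt(sum(c * c for c in u.values()))
    norm_v = math.sqrt(sum(c * c for c in v.values()))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

bird = vector("bird", corpus)
sky = vector("sky", corpus)
doctor = vector("doctor", corpus)

# On this toy corpus, "bird" comes out closer to "sky" than to "doctor",
# because it shares more of its contexts with "sky".
```

Nothing about the word meanings was specified by hand here; the vectors fall out of the distribution of contexts, which is what makes such models attractive for learning word meaning automatically from data.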

Is there a hot topic currently being discussed by computational linguistics scholars in the U.S. or around the world?
In computational linguistics, one current hot topic is how to automatically extract knowledge from the incredible resources created by social media, especially Wikipedia. Another is what is called Textual Entailment: Given two sentences, would a human say that the first implies the second? This is an exciting topic because it underlies a wide variety of language technology applications, and because it requires handling so many different linguistic phenomena at the same time. Then, grounding is an up-and-coming topic: In computational linguistics, we often describe a word through other words. But humans "ground" meaning through their perceptions and memories. So how can we link words to non-linguistic things, and what can we do with those links when we have them? In terms of applications, sentiment analysis is currently getting a lot of attention: Can we automatically determine whether people on the web are saying positive or negative things about a given product? And an application that has been and continues to be highly important and challenging is machine translation.

Did you participate in a research project as an undergraduate? And would you recommend research for undergrads?
I actually participated in multiple research projects. One was about theorem proving -- formalizing a line of reasoning, then automatically testing it for validity. Another was on DNA computing. Imagine a "computer" that is just a test tube full of goo. You encode data into strands of DNA, and perform computations by slicing, recombining and sorting those strands. I also helped a professor write a textbook on theoretical computer science, which was a great experience for me because it let me practice describing the ideas behind definitions and proofs in an intuitive way.

Should undergrads participate in research? Yes, absolutely! Participating in research will help you see what research is like, and if it is what you want to do. You learn useful skills for your later research career. And you gain valuable bullet points for your graduate school application.

What makes a good grad student?
The most important thing, I think, is to be curious and to love to learn new things. A graduate student also needs tenacity, and a lot of it, because most experiments don't work out the first time around, and maybe not the second time either.

What are your top three tips for students interested in applying to a linguistics graduate program?

  1. Find out whether the department is a good fit for you. Think about what your core research interests are, and who in the department you might want to work with. And tell us about all this in your application. It helps when we see that you have given serious thought to your plans for your graduate studies.
  2. When reading your application, we will be trying to find out whether you will make a good graduate student -- that is, whether you have both the curiosity and the tenacity that it takes. Undergraduate research helps. So do great recommendation letters. So does a great application essay.
  3. If your core interest is in computational linguistics, it will be extremely useful to have some background in computer science or mathematics.

What are the top five computational linguistics graduate programs?
Stanford University, Johns Hopkins University, University of Pennsylvania, University of Southern California Information Sciences Institute, and Brown University are very strong in computational linguistics. Internationally, Saarland University in Saarbrücken and the University of Edinburgh are also very strong.

What careers do alumni generally pursue after graduation from the program?
Among our linguistics students in general, the majority go on to pursue research careers, typically as faculty in Linguistics departments. In computational linguistics, students have a choice between academia and industry, as there are currently many positions in industry for computational linguists. For an overview of where our most recent graduates went, see our Graduate Placement page.

Download Dr. Erk's Profile


Academic Background: Ph.D., Linguistics, Stanford University – Stanford, CA; B.A., English, University of California at Berkeley – Berkeley, CA

What made you decide to go to graduate school?
A practical reason is that I wanted to pursue an academic career in linguistics. A personal reason is that I wanted to achieve as much as possible in my chosen area; I did not want to compromise. So that meant getting the highest degree available.

What was your dissertation topic when you were in grad school?
My dissertation deals with the relation between word meaning and syntax. Specifically it concerns the way the meaning of a verb influences how it can be combined with other words to form a sentence.

What is your area of specialization?
My work focuses on three research areas: the semantics of first- and second-person indexical pronouns like ‘you’ and ‘I’; systems of grammatical agreement in different languages; and the interface between word meaning and syntax.

What topics do you teach at UT?
I teach courses in two areas: syntax; and lexical semantics (word meaning).

What is your current research focus?
Currently (Fall 2010) I am engaged in research projects in two areas. First, I am investigating the origin and grammatical properties of an unusual verb conjugation phenomenon in Hungarian: in that language a special form of the verb is used if the verb has a direct object that is definite ('the bird', as opposed to 'a bird'). Second, I am working on a typological study -- that is, a study comparing many different languages -- on the question of why verbs often agree with their subjects in person, while adjectives usually do not agree in person, even when they agree in number and gender.

Is there a hot topic currently being discussed by linguistics scholars in the U.S. or around the world?
One big question being discussed these days is whether and to what extent the mental representation of language involves a discrete combinatorial system in which symbols combine according to rules. Basically this question concerns the ‘squishiness’ and variability of language.

Did you participate in a research project as an undergraduate? And would you recommend research for undergrads?
I wrote an undergraduate thesis on the poetry of William Carlos Williams. I recommend that any undergrads who hope to continue to graduate school should try to do some original research. One reason is that you can use the product of that research as a writing sample when you apply to graduate schools. Also it may help you decide whether in fact you want to continue in a field.

What makes a good grad student?
Patience, hard work, and a taste for Top Ramen.

What are your top three tips for students interested in applying to a linguistics graduate program?

  1. Your Statement of Purpose should show that you know what linguistics is.
  2. Do some research while still an undergraduate (see above); if possible include a writing sample that illustrates your research experience.
  3. Look at your application file from the perspective of someone who must read many, many such files: make it clear and readable, and don’t clutter it with irrelevant information.

What are the top five linguistics graduate programs?

  1. University of Texas at Austin (of course)
  2. Stanford University
  3. Ohio State University
  4. University of California at Santa Cruz
  5. Massachusetts Institute of Technology

What careers do alumni generally pursue after graduation from the program?

  1. Academic careers: research and teaching in linguistics.
  2. Computational linguistics careers.
  3. Careers related to foreign language teaching.

Download Dr. Wechsler's Profile
