Chapter 1: “Defining Discourse Analysis and its Scope for Language
Teaching” (Demir & Akbaş)
Discourse analysis, as Demir and Akbaş present it, begins from a deceptively simple question: what happens when we look beyond individual sentences and ask how language is actually used in context? The chapter opens by distinguishing between sentence-level grammar — the traditional domain of linguistic analysis — and discourse-level meaning, where context, speaker intent, and social situation become central. For language teachers, this shift in focus transforms how they think about communication and how they teach it. Language, the authors argue, isn’t just a system of forms; it’s a medium for interaction, shaped by cultural expectations, social relationships, and pragmatic goals.
The chapter first clarifies the term “discourse.”
Drawing on both linguistics and sociolinguistics, Demir and Akbaş define
discourse as language in use — stretches of spoken or written text that perform
communicative functions within specific contexts. The definition inherently
resists reduction to isolated sentences; it insists that meaning is constructed
through the interplay of linguistic forms and situational factors. For example,
the same grammatical structure (“Can you open the window?”) may function as a
question, a request, or even a polite command, depending on who is speaking, to
whom, and under what circumstances. Discourse, then, is not merely linguistic
material but a reflection of human interaction.
The authors contrast discourse analysis
with traditional linguistic analysis. While linguistics often treats
language as a system of abstract rules, discourse analysis examines how those
rules are mobilized in real situations. It looks for coherence and intention
rather than correctness. This distinction is particularly relevant for language
teaching, where a focus on grammar alone often produces learners who can form
correct sentences but struggle to use them naturally in conversation or
writing. Demir and Akbaş position discourse analysis as a corrective — a bridge
between form and function.
From this conceptual base, the chapter expands
on the scope of discourse analysis in language education. The authors
identify several layers at which discourse operates: textual (organization of
ideas), interactional (how participants manage turns, politeness, and repair),
and social (how identity and power relations shape communication). Each layer
offers insights teachers can use to help learners understand authentic language
use. For instance, teaching students about how spoken exchanges are structured
— greetings, small talk, topic shifts — equips them to engage more effectively
in real-world interactions.
Central to Demir and Akbaş’s argument is the role
of context. Context isn’t treated as a background variable but as an active
element in meaning-making. They distinguish between linguistic context (the
surrounding text), situational context (the immediate physical and social
environment), and cultural context (shared beliefs and norms that influence
interpretation). Each of these levels affects how discourse is understood. For
example, irony or humor often depends on cultural context, while reference
(“this,” “that,” “here,” “there”) depends on situational context. Without
attending to these layers, learners may misinterpret intended meanings even if
they understand the words.
The authors also discuss the relationship
between discourse and communicative competence. In communicative approaches
to language teaching, the goal is not merely grammatical accuracy but
appropriate use. Discourse analysis offers the descriptive and analytical tools
to achieve this. By examining real conversations, classroom interactions, or
written genres, teachers can help students see how meaning is negotiated, how
coherence is achieved, and how social roles are performed through language. For
example, understanding how academic articles establish authority or how service
encounters manage politeness directly informs pedagogical choices.
Demir and Akbaş then turn to types of
discourse relevant to teaching. They differentiate between spoken and
written discourse, monologic and dialogic forms, institutional and casual talk.
Each type reveals different conventions and constraints. For instance, spoken
discourse tends to feature hesitation, overlap, and repair, whereas written
discourse emphasizes cohesion and organization. Classroom activities should
expose learners to both, since language proficiency depends on navigating
across modes and contexts.
A key contribution of this chapter is its insistence on authenticity. Real language use, the authors argue, is often messy, nonlinear, and context-bound — unlike the clean examples found in many textbooks. Discourse analysis encourages teachers to bring authentic materials into the classroom: transcripts of conversations, online discussions, advertisements, academic essays, and more. By analyzing how language actually functions in such texts, learners can develop a deeper, more flexible understanding of meaning. The focus shifts from memorizing rules to recognizing patterns of use.
In addition to defining discourse analysis,
the chapter sketches its interdisciplinary foundations. The field draws
from linguistics, sociology, anthropology, and pragmatics. Scholars like
Foucault, Goffman, and Halliday have all contributed perspectives on how
discourse constructs reality, organizes social relations, and reflects power.
Demir and Akbaş note that while some approaches emphasize ideology and social
structure, others concentrate on micro-level interactions. For language
teaching, both levels matter: learners need to understand how discourse encodes
social norms as well as how it operates turn by turn.
Pedagogically, the authors suggest several
implications. First, language curricula should incorporate discourse-level
objectives — such as interpreting implicature, managing topic development, or
recognizing register differences. Second, teachers should train learners to
analyze discourse critically: not just what is said, but how and why it is said
in particular ways. For instance, students might examine how advertising
language constructs desire or how academic writing signals stance and
authority. Such tasks cultivate both linguistic awareness and critical
literacy.
Demir and Akbaş also warn against overgeneralization.
Discourse varies across cultures, institutions, and genres; therefore, teachers
must avoid imposing one communicative norm as universal. Instead, they should
encourage learners to compare discourse practices across languages and cultures
— a process that enhances intercultural competence. A Turkish learner of
English, for example, may discover that politeness strategies differ
significantly between the two languages, influencing how requests or refusals
are phrased.
The chapter concludes with a reflection on the
transformative potential of discourse analysis in language education. By
bringing attention to the interaction between language and context, it moves
teaching beyond mechanical drills toward meaningful communication. It empowers
learners to become analysts of language, capable of interpreting and producing
discourse appropriate to varied situations. For teachers, it offers a lens to
design materials that mirror authentic language use and promote deeper
engagement with meaning.
Ultimately, Demir and Akbaş present discourse
analysis not just as an academic framework but as a pedagogical mindset
— one that views language as social action. Understanding discourse means
understanding people: their intentions, relationships, and cultural worlds. For
language teaching, this understanding redefines what it means to “know” a
language. To be competent is not merely to know grammar and vocabulary, but to
navigate discourse — to interpret, infer, and interact meaningfully within
context.
Chapter 2: “Basic Concepts in Discourse Analysis” (Yastıbaş A.E.)
Yastıbaş’s chapter lays the conceptual
foundation for anyone trying to understand or teach language through the lens
of discourse. Where Demir and Akbaş defined the scope of discourse analysis,
this chapter gives readers the vocabulary — both literal and theoretical — to
analyze how real communication works. It reads almost like a toolkit for
teachers and students: each concept explained not in isolation, but in relation
to how it appears in everyday interaction or classroom practice.
At the center of Yastıbaş’s approach is the
idea that language forms a system of connected meanings, not a
collection of independent sentences. To make sense of discourse, one needs to
grasp the mechanisms that hold language together — mechanisms such as cohesion,
coherence, register, genre, speech acts, and turn-taking.
Each term names a piece of the puzzle: how texts hang together, how meaning
flows, and how speakers coordinate to create understanding.
The chapter begins with cohesion, the
surface-level glue that binds a text. Yastıbaş draws on Halliday and Hasan’s
classic framework, identifying cohesive devices like reference (using pronouns
or demonstratives to link sentences), substitution (replacing repeated items
with a shorter expression), ellipsis (leaving out recoverable information),
conjunction (logical connectors such as “however,” “therefore”), and lexical
cohesion (repetition, synonymy, or collocation). Through examples, Yastıbaş
shows how cohesion creates textual unity: “Ali bought a new phone. It has a
great camera.” Without cohesive ties, discourse fragments into unrelated
statements. Teachers, therefore, can help learners trace these links, training
them to recognize how English achieves flow and connectedness.
But cohesion alone doesn’t guarantee
understanding. The next key concept, coherence, refers to the deeper
sense of logical and semantic unity. A text may be cohesive yet incoherent if
its ideas lack order or relevance. Yastıbaş illustrates this difference with
deliberately disjointed examples: grammatically well-formed sentences that make
no sense when read together. Coherence depends on shared knowledge,
expectations, and purpose — all contextual elements. For language teaching,
this distinction is crucial: learners must learn not only to form cohesive
sentences but to organize their ideas coherently, according to genre
conventions and communicative goals.
The discussion then moves to register —
the variety of language used in a particular situation. Register reflects
choices of vocabulary, syntax, and style appropriate to context. Yastıbaş uses
the tripartite model of field, tenor, and mode:
- Field
concerns what is happening (the subject matter or activity).
- Tenor
concerns who is involved and what their relationship is.
- Mode
concerns the channel of communication (spoken, written, digital, etc.).
These variables shape how people speak or write. For instance, the language of a scientific report differs from that of a casual conversation not because of grammar alone, but because of differing purposes and relationships. Teaching register awareness helps learners adapt language appropriately — formal for academic writing, relaxed for conversation — and avoid pragmatic missteps.
Closely related is genre, a concept
often misunderstood as simply “type of text.” Yastıbaş clarifies that genre
involves socially recognized communicative purposes and structural patterns.
Each genre — whether a recipe, a business email, or a research article —
carries predictable moves and rhetorical structures. Understanding genre equips
students to decode and produce texts that meet audience expectations. For
example, in English academic writing, an introduction typically establishes
context, states a thesis, and previews structure. In contrast, narratives build
tension through chronological events. By teaching genre, educators give
learners templates for organizing meaning, not just filling grammar slots.
After textual organization, Yastıbaş turns to speech
act theory, derived from Austin and Searle. Every utterance, she reminds
us, performs an action: requesting, apologizing, promising, refusing. Saying
“Could you pass the salt?” enacts a request, not a question about ability.
Speech acts demonstrate how meaning extends beyond literal form — a lesson
essential for second-language learners who might interpret direct translations
too literally. Following Austin, Yastıbaş distinguishes three dimensions of an
utterance: the locutionary act (the actual words spoken), the illocutionary act
(the intended function), and the perlocutionary act (the effect on the
listener). For teaching, analyzing speech acts builds
pragmatic competence: students learn when and how to make requests, give
compliments, or decline invitations politely according to cultural norms.
From speech acts, the chapter expands into conversational
structure, particularly turn-taking and repair mechanisms.
Drawing on conversation analysis, Yastıbaş explains that spoken discourse is a
cooperative enterprise governed by implicit rules: participants take turns,
respond to cues, and manage overlaps. The turn-taking system is remarkably
efficient — one speaker yields, another enters, and interruptions are
negotiated. When breakdowns occur, speakers use repair strategies (e.g.,
“I mean…,” “sorry, what I meant was…”) to maintain mutual understanding.
Classroom exploration of real conversation transcripts helps students perceive
these dynamics, understand natural rhythm, and practice interactive listening.
Adjacency pairs —
predictable question–answer, greeting–greeting, or offer–accept sequences —
further illustrate how spoken discourse builds coherence socially, not just
grammatically. Such micro-structures show learners that conversation is
patterned, not random. Recognizing these patterns makes it easier to
participate in dialogue and anticipate appropriate responses.
Another key topic is deixis — words
whose meaning depends on context (e.g., “here,” “there,” “this,” “now,” “you”).
Yastıbaş emphasizes that deixis reveals the embeddedness of language in
situation: pronouns and adverbs point to the physical, temporal, or social
coordinates of speech. Non-native learners often find deictic expressions
tricky because their reference shifts with perspective. Teachers can highlight
this context dependence by analyzing short dialogues or narratives, asking
students to identify who “I” and “you” refer to and where “here” actually is.
Throughout the chapter, Yastıbaş constantly
loops theory back to pedagogical application. Discourse analysis, she
argues, is not an abstract pursuit but a practical lens for improving teaching
materials and classroom activities. For instance, rather than drilling isolated
sentences, teachers can design tasks that highlight cohesive devices, organize
coherent paragraphs, or simulate authentic speech events. By engaging students
in analyzing actual texts — interviews, newspaper articles, or classroom
exchanges — they internalize how discourse works in the wild.
The author also touches on pragmatics,
the broader study of how context influences meaning. Pragmatic awareness
overlaps with discourse analysis but focuses more narrowly on speaker intention
and listener inference. Yastıbaş sees pragmatics and discourse analysis as
complementary: the former explains how meaning is implied, while the latter
explains how it is structured and maintained across larger stretches of
communication. For teaching, both are essential to move learners beyond literal
comprehension toward nuanced interpretation.
The chapter underscores the interdependence
of these concepts. Coherence depends on cohesion, but also on genre and
register. Speech acts unfold within turn-taking structures, influenced by
cultural norms. Deictic references anchor coherence in time and space.
Together, these features create the fabric of discourse — a fabric teachers
must help learners navigate consciously.
Yastıbaş concludes by highlighting the teacher’s
role as discourse facilitator. Teachers should model discourse strategies,
scaffold student interactions, and encourage reflection on how meaning is
constructed. Classroom discussions about why a conversation “feels awkward” or
why a paragraph “doesn’t flow” are in fact moments of discourse analysis. By
giving students the terminology and awareness to articulate such observations,
teachers foster metalinguistic competence — the ability to think about language
as a system of choices and effects.
In summary, this chapter transforms abstract
linguistic ideas into practical analytical tools. It builds a foundation for
the later chapters on spoken discourse, vocabulary, and corpus use by grounding
readers in the essential components that make communication meaningful. For
language teachers, mastering these concepts means gaining the insight to
diagnose learner errors at the level of discourse — not just grammar — and to
design instruction that mirrors real communicative practice.
Chapter 3: “Spoken Language Analysis” (Girgin U. & Acar Y.)
Girgin and Acar’s chapter shifts the
analytical lens from general discourse theory to the living, breathing reality
of spoken language—the form of discourse most learners encounter first,
yet the one most often simplified or ignored in classrooms. Where written texts
can be planned, revised, and polished, spoken discourse unfolds spontaneously,
moment by moment. The authors argue that to teach language effectively, we must
understand how real conversation works: its rhythm, its unpredictability, and
its social meanings.
They open by noting that spoken language is
not a defective version of writing. It follows its own grammar, logic, and
conventions. Many textbooks still treat speech as merely “informal writing”
filled with errors or redundancies, but Girgin and Acar insist this is a
mistake. Spoken discourse obeys different principles—those of interaction and
co-construction. Two or more speakers jointly build meaning in real time,
drawing on shared knowledge, gestures, intonation, and immediate context. Every
pause, overlap, and repair serves a purpose within this cooperative act.
Distinctive Features of Spoken Discourse
The chapter carefully identifies features that
distinguish spoken language from written text:
- Spontaneity and planning –
Speakers rarely plan an entire utterance before speaking. They formulate
ideas as they talk, resulting in false starts, hesitations,
repetitions, fillers (“uh,” “you know”), and syntactic adjustments.
These are not signs of weakness but strategies for keeping the
conversational floor while thinking ahead.
- Turn-taking –
Conversation depends on an intricate system of taking turns. One
participant speaks, another responds, often within fractions of a second.
Turn boundaries are cued by intonation, gaze, pauses, or syntactic
completion. Girgin and Acar draw on conversation analysis (CA) to show how
this system works remarkably smoothly, minimizing silence and overlap.
When overlaps occur, speakers resolve them through short pauses, yielding,
or simultaneous completion—behaviors that students can practice and
analyze through recorded dialogues.
- Adjacency pairs –
Talk unfolds through paired actions: question-answer, greeting-greeting,
offer-acceptance, compliment-response. These pairs build predictability
and coherence. Recognizing them helps learners anticipate responses and
understand conversational flow.
- Repair and self-correction –
Communication is full of small breakdowns. Speakers correct themselves (“I
mean…,” “sorry, what I wanted to say is…”), clarify misunderstandings, or
prompt clarification from others. Far from being embarrassing, repair
mechanisms are vital tools for sustaining interaction. The authors
highlight that teaching these strategies boosts learners’ confidence and
keeps conversations moving despite errors.
- Back-channeling –
Short listener responses (“yeah,” “uh-huh,” “really?”) signal engagement
and understanding. Their absence may be perceived as disinterest. Girgin
and Acar suggest that teachers explicitly model such listener cues so
learners grasp their pragmatic importance.
- Prosody and intonation –
Meaning in speech is carried as much by tone, pitch, and rhythm as by
words. Rising intonation can signal a question, disbelief, or politeness;
stress can highlight contrast. Exercises that include imitation,
shadowing, and analysis of intonation contours can train learners to
perceive and produce these subtleties.
Interactional Meaning
The authors argue that spoken discourse is
not merely transactional but deeply interactional. Written communication typically
aims to transmit information, whereas spoken interaction also builds relationships,
manages identity, and negotiates solidarity. Small talk, for example, may
convey very little factual content yet performs heavy social work—establishing
rapport, politeness, and belonging. Teachers who dismiss such exchanges as
trivial miss an opportunity to develop students’ pragmatic and sociolinguistic
competence.
Girgin and Acar illustrate this point with
examples from real conversation transcripts. When two colleagues exchange
“How’s it going?” “Not bad, you?” the words themselves are formulaic, but the
underlying function is relational. Recognizing this difference between what
is said and what is done through saying it aligns with the
speech-act perspective introduced earlier in Yastıbaş’s chapter.
Analyzing Spoken Data
The chapter then turns to methodology—how
analysts study spoken discourse. Transcription becomes a critical tool. Unlike
written text, conversation must be represented with symbols capturing pauses,
overlaps, elongations, and intonation. Girgin and Acar introduce standard
transcription conventions (such as those from Jefferson’s system) and argue
that teachers can adapt simplified versions for classroom use. Having students
transcribe short audio clips helps them notice real-world patterns of
interaction they’d otherwise overlook.
From transcription, analysis moves to
identifying patterns of turn organization, topic management, and discourse
markers. Expressions like “you know,” “I mean,” or “well” often serve functions
unrelated to their literal meaning—they signal transition, mitigate
disagreement, or buy time. Girgin and Acar encourage teachers to treat such
markers not as filler words to eliminate but as interactional tools to
master.
Pedagogical Implications
A major section of the chapter bridges theory
and classroom practice. The authors argue that traditional language instruction
privileges written norms—complete sentences, polished vocabulary, and formal
style—while treating speech as secondary. Yet learners primarily need spoken
competence to navigate real life. Discourse analysis provides a way to make
speaking instruction more authentic.
They propose several teaching strategies:
- Using authentic recordings –
Real conversations, interviews, podcasts, and classroom interactions can
replace scripted dialogues. Teachers can guide learners to notice natural
features: pauses, interruptions, back-channels, and repairs.
- Role-plays and simulations –
Activities should mirror real communicative situations: making
appointments, resolving misunderstandings, or negotiating opinions. During
feedback, teachers can highlight discourse-level elements rather than
grammatical errors alone.
- Awareness-raising tasks –
Students might compare a textbook dialogue with a real conversation on the
same topic, identifying what makes the latter sound more natural.
- Intonation practice –
Listening to and imitating different intonation patterns builds awareness
of pragmatic meaning: rising tones for uncertainty, falling tones for
completion, stress for contrast.
Girgin and Acar emphasize that focusing on
spoken discourse also nurtures listening skills. When students
understand the structures of conversation—how turns are organized, how repairs
occur—they become more effective listeners. They learn to predict what comes
next, interpret partial utterances, and tolerate ambiguity.
Cultural and Contextual Dimensions
The chapter broadens the discussion to cross-cultural
variation in spoken discourse. Turn-taking norms, silence tolerance,
politeness conventions, and feedback behaviors differ widely across cultures.
In some languages, overlap signals enthusiasm; in others, it may seem rude.
Turkish and English, for instance, differ in how much silence is comfortable
between turns. Teachers should therefore raise learners’ awareness of such
variation to prevent pragmatic miscommunication.
Girgin and Acar advocate incorporating intercultural
pragmatics into spoken-discourse teaching. Activities can involve comparing
recordings from different cultural contexts or analyzing how the same speech
act—say, making a request—is realized differently in English and the learners’
L1. This approach fosters flexibility and intercultural sensitivity.
Spoken vs. Written Contrast
To clarify the pedagogical implications
further, the authors contrast spoken and written discourse along several
dimensions:
| Feature | Spoken | Written |
| --- | --- | --- |
| Planning | Spontaneous, real-time | Planned, revised |
| Structure | Looser, repetitive | Organized, hierarchical |
| Vocabulary | Everyday, vague | Precise, formal |
| Grammar | Fragmented clauses | Complete sentences |
| Cohesion | Intonation, repetition, deixis | Conjunctions, punctuation |
| Interaction | Immediate feedback | Delayed or none |
Understanding these differences allows
teachers to design balanced curricula. Learners should gain competence in both
modes—able to write coherently and speak naturally—rather than applying written
norms to speech.
Challenges and Opportunities
Girgin and Acar acknowledge practical
challenges: recording and analyzing spoken data can be time-consuming;
authentic materials may include dialects or slang that intimidate learners. Yet
these difficulties are outweighed by the benefits. Real speech reflects how
language actually functions. By exposing students to such material, teachers
promote adaptability and confidence.
The authors also discuss the growing role of technology—corpus
tools, speech recognition, and transcription software—that make spoken
discourse accessible for classroom analysis. Teachers can use online corpora of
spoken English to explore frequency of discourse markers or the structure of
service encounters.
Conclusion
The chapter concludes by reaffirming that spoken
language analysis redefines what it means to know a language. Competence is
not only grammatical correctness but the ability to manage turns, signal
engagement, and negotiate meaning on the fly. Teaching through discourse
analysis helps learners internalize these skills by observing how real speakers
construct conversation collaboratively.
Girgin and Acar’s message is clear: speech is
not the informal cousin of writing but the primary mode of human communication,
rich with structure, strategy, and cultural nuance. By analyzing spoken
discourse—its patterns of turn-taking, repair, and prosody—teachers can make
language learning more authentic, interactive, and socially grounded.
Chapter 4: “Discourse Analysis and Vocabulary” (Yastıbaş A.E.)
Yastıbaş’s “Discourse Analysis and Vocabulary”
chapter challenges one of the most persistent misconceptions in language
teaching — that vocabulary learning is simply a matter of memorizing individual
words and their dictionary meanings. She argues that words do not live in
isolation; they live in discourse, and their meanings emerge through
patterns of use, collocation, and context. To understand vocabulary deeply,
learners must see how words function within authentic stretches of language —
how they interact with other words, signal stance, and adapt to social
situations. This chapter therefore reframes vocabulary teaching from a static,
list-based exercise into a dynamic process of contextual and functional
understanding.
The Nature of Vocabulary in Context
The chapter begins with a critique of
traditional approaches that treat vocabulary as discrete units detached from
meaning in use. Yastıbaş highlights how such methods, though efficient for rote
learning, often leave learners unable to choose words appropriately in real
communication. A learner may know the word request but fail to realize
that in casual speech ask for is more natural. The key insight is that
word choice always depends on context — linguistic, situational, and
cultural.
In discourse analysis, vocabulary is viewed as
part of a system of relationships. A word’s meaning is partly defined by its
collocates (the words it tends to appear with) and by its role within larger
patterns of coherence and register. For example, take a photo, make a
mistake, do homework — such combinations reveal the habitual ways
speakers organize meaning. Understanding these typical pairings gives learners
insight into natural usage far beyond dictionary definitions.
Yastıbaş emphasizes the concept of lexical
patterning — the tendency of language to repeat certain structures across
contexts. She illustrates how lexical choices reveal stance, attitude, and
interpersonal meaning. The difference between He died and He passed
away is not grammatical but social; it indexes politeness, empathy, and
formality. Discourse analysis helps uncover such layers of meaning and helps
teachers show students that vocabulary is a window into culture and pragmatics.
Cohesion and Lexical Relations
Building on earlier chapters, Yastıbaş
connects vocabulary to cohesion. Words contribute to textual unity
through repetition, synonymy, antonymy, hyponymy, and collocation. These
cohesive ties create a sense of continuity that guides the reader or listener
through discourse. Consider a short paragraph about climate: “The temperature
is rising. These changes in weather patterns are affecting
agriculture.” The repetition of semantically related words forms a network of
meaning. Teaching learners to notice these links develops both vocabulary depth
and discourse awareness.
Another crucial concept is semantic prosody
— the connotative aura that certain words acquire from their typical contexts.
For instance, cause often collocates with negative outcomes (cause
trouble, cause damage), while provide frequently appears with
positive ones (provide support, provide help). Learners who miss these
subtle associations may produce grammatically correct but pragmatically awkward
sentences (cause happiness sounds odd, though possible). Yastıbaş
suggests that discourse-based vocabulary instruction should involve examining
authentic corpora or texts to explore how words “behave” in real usage.
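To make this concrete, here is a minimal Python sketch (my illustration, not material from the chapter) of the kind of quick collocate check a teacher could run before class: it lists the words that most often follow a node word such as cause in any plain-text file. The file name and window size are placeholders for whatever material the teacher has collected.

```python
import re
from collections import Counter

def right_collocates(text, node, window=3):
    """Count the words appearing within `window` tokens to the right of `node`."""
    tokens = re.findall(r"[a-z']+", text.lower())
    counts = Counter()
    for i, tok in enumerate(tokens):
        if tok == node:
            counts.update(tokens[i + 1:i + 1 + window])
    return counts

if __name__ == "__main__":
    # "sample_corpus.txt" is a placeholder for any plain-text collection.
    with open("sample_corpus.txt", encoding="utf-8") as f:
        text = f.read()
    for word, freq in right_collocates(text, "cause").most_common(15):
        print(f"{word:<15}{freq}")
```

Running the same function with provide as the node word gives a second list to set beside the first, which is precisely the contrast in semantic prosody the chapter describes.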
Register and Vocabulary Choice
Vocabulary also reflects register — the
social and situational variation in language use. The author draws on
Halliday’s field-tenor-mode model again to show how different registers call
for different lexical choices. Scientific writing favors precision and
nominalization (an investigation of factors affecting growth), while
casual conversation relies on verbs and colloquial phrases (we looked into
what makes plants grow). Learners must therefore not only know words but
also know when and where to use them.
Yastıbaş provides examples of lexical
variation across registers:
- Formal: purchase, residence, commence
- Neutral: buy, home, start
- Informal: grab, place, kick off
Such distinctions cannot be learned through
isolated study; they require contextual exposure. Teachers can present multiple
versions of the same message in different registers, asking students to analyze
how lexical choice signals relationship, tone, and purpose.
Vocabulary in Spoken vs. Written Discourse
The chapter also differentiates between spoken
and written vocabulary. Spoken language tends to use high-frequency,
general words and formulaic expressions (you know, kind of, stuff like that),
while written texts use more varied and specific vocabulary. Yastıbaş warns
teachers against dismissing spoken vocabulary as “simple.” In fact, spoken
discourse relies on multiword units or chunks — phrases that
function as single units of meaning (at the end of the day, to be
honest, you see). Teaching these chunks helps learners sound more
fluent and natural.
Conversely, written discourse values lexical
density — the concentration of content words per clause. Academic writing, for
instance, compresses meaning through nominalization and technical terms.
Learners who understand how vocabulary operates differently across modes can
adjust their language appropriately depending on purpose and audience.
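As a rough illustration (a sketch of my own, not the chapter's), lexical density can be approximated as the proportion of content words among all tokens; a fuller treatment would count content words per clause, as in Halliday's formulation. The short function-word list below is deliberately incomplete and only meant to show the idea.

```python
import re

# Deliberately small sample of function words; a real list would be much longer.
FUNCTION_WORDS = {
    "the", "a", "an", "and", "or", "but", "of", "to", "in", "on", "at", "for",
    "with", "by", "is", "are", "was", "were", "be", "it", "this", "that",
    "we", "you", "i", "they", "he", "she", "not", "do", "did", "have", "has",
}

def lexical_density(text):
    """Rough lexical density: share of content words among all tokens."""
    tokens = re.findall(r"[a-z']+", text.lower())
    if not tokens:
        return 0.0
    content = [t for t in tokens if t not in FUNCTION_WORDS]
    return len(content) / len(tokens)

spoken = "well we looked into what makes plants grow and it was kind of interesting"
written = "an investigation of factors affecting plant growth produced clear results"
print(f"spoken:  {lexical_density(spoken):.2f}")
print(f"written: {lexical_density(written):.2f}")
```

Even with this crude measure, the written sentence (adapted from the register examples above) scores noticeably higher than its conversational paraphrase.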
Lexical Cohesion and Thematic Development
Yastıbaş highlights how vocabulary choices
shape the flow of ideas across paragraphs or turns in conversation. Through lexical
cohesion, writers and speakers establish thematic continuity. She provides
examples showing how repeated or semantically related vocabulary creates a
thread that ties sentences together:
“The government announced a new education
policy. The plan aims to improve schools and teachers’
training. Supporters believe the reforms will modernize the system.”
Here, the lexical set (policy, plan,
schools, training, reforms, system) reinforces the same semantic field,
ensuring coherence. Recognizing such patterns helps learners both comprehend
and produce well-structured texts.
Teaching Vocabulary through Discourse Analysis
Moving from theory to practice, Yastıbaş
outlines several pedagogical applications. First, teachers can design context-based
vocabulary tasks that focus on how words operate in real discourse. For
example, rather than teaching synonyms in isolation, students can examine their
usage in authentic sentences: how say differs from tell, speak,
or talk depending on syntactic and pragmatic context.
Second, she encourages the use of corpus
tools and concordance lines to observe how words behave across large
collections of texts. By examining hundreds of real examples, learners discover
common collocations, grammatical patterns, and register tendencies. Even
without advanced software, teachers can simulate this by collecting short
authentic texts and having students mark recurring lexical patterns.
Third, teachers can integrate genre-based
instruction, highlighting how vocabulary choices support the purpose of
each genre. A news report favors factual, objective vocabulary (report,
confirm, announce), while an opinion piece uses evaluative language (argue,
claim, criticize). Classroom tasks might include rewriting a neutral
paragraph in persuasive style to see how vocabulary changes tone and stance.
Yastıbaş stresses that vocabulary learning
must engage both breadth (the number of words known) and depth
(how well they are known). Discourse analysis enhances depth by uncovering
subtle associations, typical patterns, and pragmatic effects. It teaches
learners to look beyond translation equivalents toward how words “behave” in
different contexts.
Vocabulary and Meaning Negotiation in Interaction
In spoken interaction, vocabulary also plays a
role in negotiating meaning. Learners often compensate for lexical gaps
by paraphrasing or using approximate expressions. Yastıbaş notes that such
strategies reflect discourse competence: speakers manage communication through
reformulation, clarification, and confirmation. Teachers can exploit this by
designing communicative tasks where vocabulary learning emerges naturally
through use — for instance, information-gap activities that require learners to
explain, describe, or negotiate unknown items.
Critical and Cultural Perspectives
The chapter concludes with a broader
reflection on how vocabulary encodes cultural and ideological meaning.
Words carry assumptions shaped by social context — for example, how “freedom,”
“family,” or “success” vary across cultures. Discourse analysis invites
learners to explore how vocabulary choices construct certain worldviews and
silence others. Critical awareness of lexical framing — in media, advertising,
or politics — equips learners to read texts not just for information but for
perspective.
Yastıbaş argues that teaching vocabulary
through discourse analysis aligns with the communicative and critical aims
of modern language education: developing learners who can interpret, evaluate,
and use language effectively in real situations. It transforms vocabulary study
from memorization into discovery — the discovery of how language reflects and
shapes social reality.
Conclusion
Ultimately, Yastıbaş’s chapter demonstrates
that vocabulary cannot be divorced from discourse. Every word is embedded in
patterns of usage, relationships, and cultural meanings. Teaching vocabulary
through discourse analysis means teaching students to see words as choices —
choices that depend on who is speaking, to whom, in what context, and for what
purpose. This approach not only builds richer lexical knowledge but also
nurtures pragmatic and intercultural competence. Learners trained to read and
listen for these patterns become more discerning and adaptable users of
language — capable of understanding not just what words mean, but how
they mean.
Chapter 5: “Corpus Linguistics Perspective for Discourse Analysis and
Language Teaching” (Bal Gezegin & Akbaş)
Bal Gezegin and Akbaş’s chapter brings
together two powerful strands of modern linguistics — discourse analysis
and corpus linguistics — to show how technology and data-driven
approaches can reshape how we study and teach language. While earlier chapters
emphasized context, interaction, and meaning, this one provides the empirical
tools to explore those phenomena systematically. A corpus, as they
define it, is a large, electronically stored collection of authentic spoken or
written texts compiled for linguistic study. Corpus linguistics, therefore, is
both a methodology and a perspective: it examines how language is used “in the
wild,” identifying patterns, frequency, and collocations that reveal how
discourse actually operates.
From Intuition to Evidence
The chapter opens by contrasting intuitive
and empirical approaches to language. Traditional grammar teaching often
relies on teacher intuition or prescriptive rules that may not reflect real
usage. Corpus linguistics replaces this intuition with evidence drawn from
millions of words of authentic data. Instead of asking, “Is this sentence
correct?” teachers and learners can ask, “How do proficient speakers actually
use this phrase?” This shift from authority to evidence is transformative for
discourse analysis and language pedagogy alike.
Bal Gezegin and Akbaş trace the origins of
corpus linguistics to early projects like the Brown Corpus and later
advances such as the British National Corpus (BNC) and COCA (Corpus
of Contemporary American English). These databases make it possible to
search for linguistic patterns across vast amounts of text — something
impossible by intuition alone. For discourse analysts, such corpora provide a
panoramic view of how cohesion, register, and lexical choices vary across
genres and contexts.
Corpus Linguistics and Discourse Analysis: A Natural Alliance
The authors argue that discourse analysis and
corpus linguistics share the same goal: understanding language in use.
Discourse analysis traditionally involves close qualitative reading of
individual texts or interactions, while corpus linguistics provides
quantitative breadth — revealing how common certain patterns are across many
instances. Together, they offer both depth and scope. For example, a discourse
analyst might observe that politicians often use inclusive pronouns (“we,”
“our”) to build solidarity; corpus evidence can confirm how frequent this
pattern is across thousands of speeches.
This alliance enriches discourse analysis in
several ways:
- Pattern discovery:
Corpus tools uncover recurring collocations, discourse markers, and
phraseological frames.
- Frequency analysis:
Quantitative counts reveal which linguistic features are typical of
certain genres or registers.
- Comparative analysis:
Researchers can compare how the same concept is expressed in different
contexts (e.g., academic vs. conversational English).
- Authenticity:
Corpora provide genuine language data rather than textbook examples,
aligning perfectly with the emphasis on authentic discourse found
throughout the book.
Core Concepts: Collocation, Concordance, and Keyness
The heart of corpus-based analysis lies in
identifying patterns of co-occurrence. Bal Gezegin and Akbaş introduce
three central techniques:
- Collocation
refers to the habitual co-occurrence of words (e.g., strong coffee
vs. powerful coffee). These pairings show how meaning is shaped by
convention rather than rule.
- Concordance
displays every occurrence of a word within its immediate context, allowing
analysts to observe usage patterns and semantic prosody. For instance,
examining the word commit across hundreds of lines reveals its
typical collocates (crime, suicide, sin), helping learners infer
its negative connotation.
- Keyness analysis
compares corpora to find words that occur unusually frequently in one
dataset relative to another. This highlights what characterizes a
particular discourse — for example, how environmental activism texts use planet,
sustainability, and future more often than general news
articles.
Each technique, the authors note, connects
directly to classroom application: teachers can use simple concordance searches
to show students how words behave across contexts, encouraging independent
discovery of meaning and usage.
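For readers who want to see what a concordance actually looks like, here is a minimal keyword-in-context (KWIC) sketch, assuming nothing more than a plain-text file; it illustrates the technique itself, not the interface of any of the tools discussed below. Searching it for commit echoes the example above: the right-hand collocates that surface make the word's negative prosody visible.

```python
import re

def kwic(text, node, width=5):
    """Return keyword-in-context lines: `width` tokens of context on each side."""
    tokens = re.findall(r"[A-Za-z]+(?:['’][A-Za-z]+)?", text)
    lines = []
    for i, tok in enumerate(tokens):
        if tok.lower() == node.lower():
            left = " ".join(tokens[max(0, i - width):i])
            right = " ".join(tokens[i + 1:i + 1 + width])
            lines.append(f"{left:>40}  [{tok}]  {right}")
    return lines

if __name__ == "__main__":
    # "sample_corpus.txt" stands in for whatever texts the class has collected.
    with open("sample_corpus.txt", encoding="utf-8") as f:
        corpus = f.read()
    for line in kwic(corpus, "commit"):
        print(line)
```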
Corpus Tools in Language Teaching
Bal Gezegin and Akbaş advocate for a data-driven
learning (DDL) approach, where learners explore corpus data themselves
rather than receiving pre-packaged explanations. The process encourages active
noticing and hypothesis testing — core principles of discourse analysis. In a
typical DDL task, students might be given concordance lines for the phrase in
order to and asked to infer its grammatical and pragmatic function. This
analytical process mirrors how linguists study discourse and helps learners
internalize authentic patterns.
The authors also present practical classroom
tools:
- AntConc — a
free concordancer allowing teachers to search their own mini-corpora of
student writing or authentic texts.
- COCA and BNCweb
— online corpora with user-friendly interfaces that allow frequency
searches and collocation analyses.
- Sketch Engine — a
more advanced platform offering “word sketches” summarizing typical
collocations and grammatical relations.
Teachers can use these resources to design
discovery activities: for example, comparing say vs. tell,
exploring hedging devices (sort of, kind of, maybe), or analyzing the
vocabulary of opinion essays. By connecting quantitative data with qualitative
interpretation, corpus methods make discourse analysis concrete and verifiable.
Corpora and Register Variation
The chapter then explores how corpora
illuminate register and genre variation, a key theme in discourse
analysis. Different types of discourse — academic writing, conversation,
journalism, fiction — exhibit distinct lexical and grammatical tendencies.
Corpus data quantifies these differences. For instance, conversational English shows
higher frequencies of first-person pronouns and contractions, while academic
English favors nominalization and prepositional phrases.
For teachers, such findings have immediate
pedagogical value. They clarify what features learners should focus on when
moving between spoken and written modes. A corpus-informed syllabus, Bal
Gezegin and Akbaş argue, can prioritize high-frequency structures that learners
are most likely to encounter in real communication, rather than rare or
artificial textbook examples.
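The keyness idea introduced above can be sketched in a few lines of Python using the log-likelihood (G²) statistic that corpus linguists commonly use for such comparisons. This is an illustrative sketch with two tiny placeholder "corpora", not a tool the authors present; a real comparison would load sizeable files of conversational and academic English.

```python
import math
import re
from collections import Counter

def tokenize(text):
    return re.findall(r"[a-z']+", text.lower())

def keyness(corpus_a, corpus_b, top=10):
    """Log-likelihood (G2) scores for words unusually frequent in corpus_a."""
    freq_a, freq_b = Counter(tokenize(corpus_a)), Counter(tokenize(corpus_b))
    size_a, size_b = sum(freq_a.values()), sum(freq_b.values())
    scores = {}
    for word in set(freq_a) | set(freq_b):
        a, b = freq_a[word], freq_b[word]
        expected_a = size_a * (a + b) / (size_a + size_b)
        expected_b = size_b * (a + b) / (size_a + size_b)
        ll = 2 * ((a * math.log(a / expected_a) if a else 0.0) +
                  (b * math.log(b / expected_b) if b else 0.0))
        if a / size_a > b / size_b:   # keep only words characteristic of corpus_a
            scores[word] = ll
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:top]

# Tiny placeholder "corpora"; real comparisons would load large files.
spoken_like = "well I mean I think we just kind of talked about it you know I mean"
written_like = "the present study investigates the factors influencing the observed variation"
for word, score in keyness(spoken_like, written_like):
    print(f"{word:<12}{score:.2f}")
```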
Advantages of a Corpus Approach
The authors list several advantages of
integrating corpus linguistics into discourse-based teaching:
- Authenticity:
Corpora represent real language use, bridging the gap between classroom
input and real-world communication.
- Objectivity:
Teachers can base explanations on evidence, not intuition, reducing
subjectivity in error correction and feedback.
- Autonomy:
Students become researchers, discovering usage patterns independently.
- Richness of input:
Exposure to diverse contexts broadens learners’ pragmatic and lexical
awareness.
- Integration with technology:
Corpora encourage digital literacy and critical thinking about language.
Challenges and Limitations
Bal Gezegin and Akbaş also acknowledge the challenges
of corpus-based pedagogy. Access to technology, lack of training, and limited
time in curricula may hinder implementation. Moreover, corpus tools can be
intimidating for beginners. The authors recommend starting small — using short,
focused corpora or pre-prepared concordance lines rather than overwhelming
students with raw data. Teachers can gradually build learners’ confidence,
guiding them from observation to interpretation.
Another limitation concerns contextual
depth. Corpus data shows patterns, but it cannot fully explain why they
occur. Quantitative frequency must always be paired with qualitative
interpretation — precisely where discourse analysis complements corpus work.
The authors stress that the two approaches are not alternatives but partners:
corpus linguistics provides evidence; discourse analysis supplies explanation.
Applications to Discourse and Vocabulary Teaching
Connecting back to Yastıbaş’s previous
chapter, the authors show how corpus analysis enhances vocabulary instruction.
By examining collocation and semantic prosody, learners develop a more nuanced
sense of word meaning. For instance, a corpus search for issue reveals
its use in both neutral (discuss an issue) and negative (address
serious issues) contexts, helping learners grasp its pragmatic range.
Similarly, examining verbs like make, do, and take in
context clarifies idiomatic usage.
In writing instruction, corpus tools allow
students to analyze model essays or their own drafts. They can compare
frequency of connectors (however, therefore, moreover) or hedges (probably,
seems, may) with expert texts, gaining insight into stylistic norms. Spoken
corpora likewise support pronunciation and discourse-marker analysis, revealing
how expressions like you know or I mean function pragmatically in
conversation.
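A comparison of that kind can be sketched very simply: count a handful of connectors and hedges in a student draft and in an expert text, normalized per 1,000 words so the two files are comparable. The file names and item lists below are illustrative placeholders, not part of the chapter.

```python
import re
from collections import Counter

CONNECTORS = ["however", "therefore", "moreover"]
HEDGES = ["probably", "seems", "may"]

def per_thousand(path, items):
    """Frequency of each item per 1,000 running words in the file at `path`."""
    with open(path, encoding="utf-8") as f:
        tokens = re.findall(r"[a-z']+", f.read().lower())
    counts = Counter(tokens)
    scale = 1000 / max(len(tokens), 1)
    return {item: counts[item] * scale for item in items}

# File names are placeholders for a learner draft and a published model text.
for label, path in [("student draft", "student_draft.txt"),
                    ("expert text", "expert_text.txt")]:
    print(label)
    for item, rate in per_thousand(path, CONNECTORS + HEDGES).items():
        print(f"  {item:<10}{rate:5.1f} per 1,000 words")
```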
Corpus-Informed Teacher Development
Beyond classroom practice, Bal Gezegin and
Akbaş argue that corpora are invaluable for teacher training and materials
design. Teachers can use corpora to verify textbook content, identify
authentic examples, or select high-frequency vocabulary for syllabus design.
For instance, research shows that the 2,000 most frequent word families cover
around 80% of everyday English usage — data that directly informs curriculum
planning.
Corpus-informed teacher education also fosters
critical awareness. Teachers trained to interpret corpus evidence become
reflective practitioners who question linguistic myths (“native speakers never
say X,” “formal English never uses contractions”). Instead, they base their
instruction on real data, aligning pedagogy with actual language use.
Conclusion: A Data-Driven Future for Discourse Teaching
The chapter concludes by reaffirming that
corpus linguistics revolutionizes how discourse is studied and taught. It
grounds discourse analysis in empirical reality, allowing teachers and learners
to observe patterns of cohesion, register, and vocabulary across authentic
texts. When combined with discourse analysis, corpus methods bridge micro-level
detail (individual word use) and macro-level meaning (genre, ideology, social
context).
For language teaching, the message is clear:
corpus-informed pedagogy empowers learners to become analysts of their own
input. It transforms the classroom from a place of rule transmission into a
laboratory of discovery. Learners do not just memorize forms; they explore how
those forms function in real communication.
Bal Gezegin and Akbaş’s contribution thus
extends the book’s central theme — that language must be understood in context
— into the realm of digital analysis. By showing how technology can capture and
quantify authentic discourse, they offer educators powerful tools to make
teaching evidence-based, exploratory, and genuinely aligned with how people use
language. Corpus linguistics, in their vision, is not a separate field but the empirical
backbone of discourse analysis — and, by extension, of any language
teaching that seeks to reflect real communication rather than prescriptive
tradition.