Pad token #202

Open
Description

@AyeshaSarwar

Can you explain how you handle the pad token? Since captions in a batch have to be padded so that they all have the same length, we then feed the padded captions into the LSTM, right? So how and where do you ignore the pad token? If anyone can explain, thanks.
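A common way to handle this in PyTorch (a sketch of the usual technique, not necessarily what this repository does — `PAD_IDX` and the tensor shapes below are illustrative assumptions) is to pass `ignore_index` to `nn.CrossEntropyLoss`, so that time steps whose target is the pad token contribute nothing to the loss or gradient. Another common complement is `nn.utils.rnn.pack_padded_sequence`, which stops the LSTM from even processing padded steps.

```python
import torch
import torch.nn as nn

# Assumed setup: a small vocabulary where index 0 is reserved for <pad>.
PAD_IDX = 0
vocab_size, batch, seq_len = 10, 2, 5

# Stand-ins for the decoder LSTM's output logits and the padded targets.
torch.manual_seed(0)
logits = torch.randn(batch, seq_len, vocab_size)
targets = torch.tensor([[3, 4, 5, PAD_IDX, PAD_IDX],
                        [6, 7, 8, 9, 2]])

# ignore_index makes the loss skip every position whose target is <pad>.
criterion = nn.CrossEntropyLoss(ignore_index=PAD_IDX)
loss = criterion(logits.reshape(-1, vocab_size), targets.reshape(-1))

# Sanity check: compute per-token losses, mask out pad positions by hand,
# and average the rest — this matches the ignore_index result.
per_token = nn.CrossEntropyLoss(reduction="none")(
    logits.reshape(-1, vocab_size), targets.reshape(-1))
mask = targets.reshape(-1) != PAD_IDX
manual = per_token[mask].mean()
print(torch.allclose(loss, manual))  # → True
```

With this in place the padded captions can still be fed through the LSTM as-is; the pad positions simply never affect training.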
