Human Generated Data

Title

Untitled (woman holding camera on street)

Date

c. 1950

People

Artist: Mary Lowber Tiers, American, 1916-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15738

Machine Generated Data

Tags (confidence scores in percent)

Amazon
created on 2022-02-05

Person 98.9
Human 98.9
Person 98
Person 97.9
Person 97.8
Clothing 91
Apparel 91
Person 90.5
Art 82.6
Advertisement 82.5
Text 73.2
Suit 70
Coat 70
Overcoat 70
Photography 66.5
Photo 66.5
Drawing 65.9
Female 64.6
Portrait 63.2
Face 63.2
Collage 60.2
Poster 59.2
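
These labels match the tag-plus-confidence output of Amazon Rekognition's DetectLabels operation. A minimal sketch of how such tags could be reproduced with boto3; the S3 bucket and object key are hypothetical placeholders, since the page does not document where the digitized image is stored:

```python
import boto3

# Rekognition client; assumes AWS credentials are configured in the environment.
client = boto3.client("rekognition")

# Hypothetical S3 location for the digitized print.
response = client.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "4.2002.15738.jpg"}},
    MaxLabels=25,
    MinConfidence=50.0,
)

# Each label carries a name and a confidence score, matching the
# "Person 98.9"-style rows listed above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```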

Clarifai
created on 2023-10-29

people 100
group 99.6
interaction 98.1
adult 98
man 98
woman 96.3
child 96.1
administration 94.9
leader 94.4
music 94.2
outfit 92.9
nostalgia 92.6
several 91.8
family 91.1
three 90.6
war 90.1
group together 89.5
nostalgic 89.2
wear 88.9
military 88.8

Imagga
created on 2022-02-05

book jacket 36.2
newspaper 29.6
jacket 29.1
daily 26.6
blackboard 23.7
product 23
wrapping 21.4
vintage 19
creation 18.1
man 17.5
old 17.4
art 15.6
covering 15.3
people 12.8
painter 12.2
negative 12.2
ancient 12.1
office 12
black 12
film 11.7
antique 11.2
letter 11
business 10.9
symbol 10.8
comic book 10.5
money 10.2
envelope 10.2
male 9.9
currency 9.9
postmark 9.8
retro 9.8
stamp 9.7
mail 9.6
post 9.5
wall 9.4
window 9.4
grunge 9.4
banking 9.2
portrait 9.1
paint 9
aged 9
one 9
shows 8.9
printed 8.8
postage 8.8
home 8.8
person 8.7
culture 8.5
design 8.4
house 8.4
dollar 8.3
circa 7.9
postal 7.8
paper 7.8
artist 7.7
card 7.6
sculpture 7.6
bill 7.6
power 7.5
bank 7.2
room 7.1
icon 7.1
drawing 7.1
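
The Imagga rows have the same tag-plus-confidence shape. A minimal sketch against Imagga's public v2 tagging REST API; the credentials and image URL below are placeholders:

```python
import requests

# Hypothetical credentials and image URL; Imagga's v2 tagging endpoint
# is documented at https://docs.imagga.com/.
API_KEY = "your_api_key"
API_SECRET = "your_api_secret"
IMAGE_URL = "https://example.com/4.2002.15738.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Tags arrive sorted by confidence, matching the "book jacket 36.2"-style
# rows listed above.
for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```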

Microsoft
created on 2022-02-05

text 99.5
drawing 98.3
person 96.8
sketch 96.2
window 92.2
clothing 89.1
old 78.2
man 67.1
cartoon 56.7
posing 45.8
vintage 27

Face analysis

AWS Rekognition (first face)

Age 25-35
Gender Female, 89.5%
Happy 48.3%
Disgusted 37.8%
Calm 7.2%
Angry 3.7%
Surprised 1.4%
Sad 0.9%
Fear 0.5%
Confused 0.3%

AWS Rekognition (second face)

Age 48-56
Gender Male, 99.9%
Calm 93.5%
Sad 2.2%
Happy 1.9%
Angry 0.7%
Confused 0.6%
Surprised 0.5%
Disgusted 0.4%
Fear 0.1%
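
The age range, gender estimate, and per-emotion percentages in the two blocks above correspond to the fields of Rekognition's DetectFaces response. A minimal sketch, reusing the same hypothetical S3 location as earlier:

```python
import boto3

client = boto3.client("rekognition")

# Attributes=["ALL"] requests age range, gender, and emotion scores.
response = client.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "4.2002.15738.jpg"}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are scored independently; sort from most to least likely,
    # as in the lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```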

Google Vision (first face)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (second face)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
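
Google Cloud Vision reports face attributes as coarse likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the rows above read "Very unlikely" and "Unlikely". A minimal sketch with the google-cloud-vision Python client; the image URI is a placeholder:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Hypothetical image location; any public or Cloud Storage URI would work.
image = vision.Image()
image.source.image_uri = "gs://example-bucket/4.2002.15738.jpg"

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum value such as VERY_UNLIKELY or
    # UNLIKELY, matching the rows shown above.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```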

Feature analysis

Amazon

Person 98.9%
Person 98%
Person 97.9%
Person 97.8%
Person 90.5%
Suit 70%
Poster 59.2%

Text analysis

Amazon

ENGLAND
ENGLAND I9
I9

Google

ENGLAND IS
ENGLAND
IS
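
The overlap between rows ("ENGLAND I9" next to "ENGLAND" and "I9") is typical of OCR services that return each detected line as well as its component words; Amazon and Google simply disagree on whether the second word reads "I9" or "IS". A minimal sketch with Rekognition's DetectText, under the same hypothetical S3 assumptions as above:

```python
import boto3

client = boto3.client("rekognition")

response = client.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "4.2002.15738.jpg"}}
)

# Rekognition returns each detected LINE plus each WORD within it, which is
# why both the line "ENGLAND I9" and the words "ENGLAND" and "I9" appear.
for detection in response["TextDetections"]:
    print(detection["Type"],
          repr(detection["DetectedText"]),
          f"{detection['Confidence']:.1f}%")
```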