Human Generated Data

Title

Untitled (people in theater audience watching show; three boys in front)

Date

1954

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14315

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Interior Design 99.9
Indoors 99.9
Furniture 99.9
Audience 99.4
Human 99.4
Crowd 99.4
Person 99.2
Person 99.1
Person 98.7
Person 98.1
Person 97.5
Person 96.5
Person 96.1
Chair 95.5
Person 94.4
Room 91.4
Clothing 89
Apparel 89
Accessory 84.5
Accessories 84.5
Sunglasses 84.5
Chair 84.3
Person 82.1
Sitting 81.3
People 69.9
Head 66
Face 63
Portrait 61.8
Photography 61.8
Photo 61.8
Overcoat 59.8
Suit 59.8
Coat 59.8
Female 59.6
Couch 58
Amusement Park 56.1
Theme Park 56.1
Person 47.1
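
The labels above are attributed to Amazon and resemble output from Amazon Rekognition's DetectLabels API. A minimal sketch of requesting comparable labels with boto3 follows; the file name, region, and confidence threshold are placeholder assumptions, not values taken from this record.

```python
import boto3

# Placeholder region; any Rekognition-supported region works.
rekognition = boto3.client("rekognition", region_name="us-east-1")

# Placeholder file name for the photograph in this record.
with open("4.2002.14315.jpg", "rb") as image_file:
    response = rekognition.detect_labels(
        Image={"Bytes": image_file.read()},
        MinConfidence=45,  # assumed cutoff; the lowest score listed above is 47.1
    )

# Print each label with its confidence, matching the "Name score" layout above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```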

Imagga
created on 2022-01-29

brass 32.9
cap 25
wind instrument 24.7
clothing 24.3
headdress 23.9
bathing cap 21
man 18.1
person 18.1
people 17.8
musical instrument 16.8
adult 15
happy 13.8
male 13.4
happiness 11.7
shower cap 11.5
human 10.5
business 10.3
sitting 10.3
chair 10
portrait 9.7
sexy 9.6
covering 9.6
lifestyle 9.4
laptop 9.3
communication 9.2
fashion 9
consumer goods 9
body 8.8
holiday 8.6
casual 8.5
attractive 8.4
glass 8.3
cheerful 8.1
helmet 8
handsome 8
celebration 8
working 7.9
model 7.8
men 7.7
outdoor 7.6
health 7.6
art 7.5
symbol 7.4
water 7.3
lady 7.3
suit 7.2
looking 7.2
smile 7.1
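
The Imagga tags above appear to come from Imagga's image tagging service. A hedged sketch using the v2 /tags endpoint over HTTP follows; the credentials and image URL are placeholders, and the response shape should be checked against Imagga's current documentation.

```python
import requests

IMAGGA_KEY = "your_api_key"        # placeholder credential
IMAGGA_SECRET = "your_api_secret"  # placeholder credential
image_url = "https://example.org/4.2002.14315.jpg"  # placeholder image URL

# Request tags for an image reachable by URL, using HTTP basic auth.
response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
response.raise_for_status()

# Print each tag with its confidence, matching the "tag score" layout above.
for entry in response.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")
```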

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

person 97.3
text 89.8
group 75.2
black and white 70.8
clothing 62.5
posing 38.5
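
The Microsoft tags above match the kind of output Azure's Computer Vision image tagging returns. A sketch using the azure-cognitiveservices-vision-computervision SDK follows; the endpoint, key, and image URL are placeholders.

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

endpoint = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
key = "your_subscription_key"                                      # placeholder
image_url = "https://example.org/4.2002.14315.jpg"                 # placeholder

client = ComputerVisionClient(endpoint, CognitiveServicesCredentials(key))

# Tag an image reachable by URL; confidences are returned on a 0-1 scale.
result = client.tag_image(image_url)

for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```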

Face analysis

AWS Rekognition

Age 36-44
Gender Female, 99.7%
Happy 99.8%
Calm 0.1%
Confused 0%
Disgusted 0%
Surprised 0%
Angry 0%
Sad 0%
Fear 0%

AWS Rekognition

Age 26-36
Gender Female, 95.4%
Calm 87.8%
Happy 6.9%
Surprised 1.4%
Disgusted 1.2%
Sad 1.2%
Angry 1%
Confused 0.3%
Fear 0.3%

AWS Rekognition

Age 26-36
Gender Female, 99.9%
Sad 57.9%
Calm 40%
Surprised 0.7%
Angry 0.4%
Fear 0.3%
Disgusted 0.2%
Confused 0.2%
Happy 0.2%

AWS Rekognition

Age 16-22
Gender Female, 61.2%
Sad 90.8%
Calm 3.5%
Confused 1.6%
Fear 1.2%
Happy 1%
Surprised 0.8%
Angry 0.6%
Disgusted 0.5%

AWS Rekognition

Age 24-34
Gender Female, 99%
Sad 69.8%
Calm 15.2%
Happy 5.6%
Confused 4.3%
Fear 1.6%
Surprised 1.4%
Angry 1.4%
Disgusted 0.8%
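
Each AWS Rekognition block above (age range, gender, and emotion scores) matches the per-face attributes that DetectFaces returns when all attributes are requested. A minimal boto3 sketch follows; the file name and region are placeholders.

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # placeholder region

# Placeholder file name for the photograph in this record.
with open("4.2002.14315.jpg", "rb") as image_file:
    response = rekognition.detect_faces(
        Image={"Bytes": image_file.read()},
        Attributes=["ALL"],  # request age, gender, and emotion estimates
    )

# Print one block per detected face, in the same layout as the entries above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
    print()
```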

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
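
The Google Vision blocks above report per-face likelihoods rather than percentages, which is how the Cloud Vision face detection API expresses confidence. A sketch with the google-cloud-vision client (version 2 or later assumed) follows; the file name is a placeholder.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Placeholder file name for the photograph in this record.
with open("4.2002.14315.jpg", "rb") as image_file:
    image = vision.Image(content=image_file.read())

response = client.face_detection(image=image)

# Each face reports likelihood enums such as VERY_UNLIKELY or POSSIBLE.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```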

Feature analysis

Amazon

Person 99.2%
Chair 95.5%
Sunglasses 84.5%

Captions

Microsoft

a group of people posing for a photo 96.9%
a group of people posing for the camera 96.8%
a group of people sitting posing for the camera 96.7%
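
The three captions above are ranked alternatives with confidences, consistent with Azure Computer Vision's image description feature. A self-contained sketch follows; the endpoint, key, and image URL are placeholders.

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

endpoint = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
key = "your_subscription_key"                                      # placeholder
image_url = "https://example.org/4.2002.14315.jpg"                 # placeholder

client = ComputerVisionClient(endpoint, CognitiveServicesCredentials(key))

# Ask for up to three caption candidates, as in the list above.
description = client.describe_image(image_url, max_candidates=3)

for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}")
```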

Text analysis

Amazon

STA
KODAK-E.VEELA-EITW
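
The two strings above are machine-read text, consistent with Amazon Rekognition's DetectText output; the partially garbled second string is reproduced exactly as detected. A minimal boto3 sketch follows; the file name is a placeholder.

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # placeholder region

# Placeholder file name for the photograph in this record.
with open("4.2002.14315.jpg", "rb") as image_file:
    response = rekognition.detect_text(Image={"Bytes": image_file.read()})

# Print only line-level detections, matching the short list above.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```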