Human-Generated Data

Title

Untitled (band playing on mobile stage in field - parking lot)

Date

c. 1975

People

Artist: Ken Whitmire Associates, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1383

Machine-Generated Data

Tags

Amazon
created on 2022-01-22

Musician 99.8
Human 99.8
Musical Instrument 99.8
Person 99.3
Stage 98.8
Guitar 98.7
Leisure Activities 98.7
Person 98.6
Wheel 98
Machine 98
Guitar 97.6
Person 94.5
Music Band 91.5
Guitarist 87.3
Performer 87.3
Drum 73.8
Percussion 73.8
Drummer 56.2
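
The Rekognition labels above pair a tag name with a 0-100 confidence score. A minimal, hypothetical sketch of retrieving such labels with boto3; the image path and the MinConfidence cutoff are assumptions, not the pipeline that produced this record:

    import boto3

    rekognition = boto3.client("rekognition")

    # Hypothetical local copy of the photograph
    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=50,  # assumed cutoff; the lowest score above is 56.2
        )

    # Each label carries a name and a 0-100 confidence score
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")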

Imagga
created on 2022-01-22

musical instrument 35.1
percussion instrument 25.9
sky 17.2
steel drum 17.1
stall 16.1
building 14
accordion 13.9
keyboard instrument 13.3
landscape 12.6
city 12.5
travel 12
old 11.8
sea 11.7
architecture 11.7
people 11.7
house 11.7
outdoor 11.5
vehicle 11.1
beach 11
industrial 10.9
water 10.7
stage 10.2
ocean 10
wind instrument 9.8
vacation 9.8
mountain 9.8
car 9.5
platform 9.2
stone 9.1
marimba 9
summer 9
sand 8.7
rock 8.7
industry 8.5
clouds 8.4
seller 8.2
transportation 8.1
history 8
farm 8
holiday 7.9
urban 7.9
sunny 7.7
seascape 7.6
ashcan 7.6
freight car 7.6
field 7.5
hill 7.5
resort 7.5
tourism 7.4
man 7.4
dirty 7.2
sunset 7.2
coast 7.2
wheeled vehicle 7.1
night 7.1
male 7.1
work 7.1
bin 7
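
The Imagga scores above come from its tagging endpoint. A minimal sketch, assuming Imagga's public /v2/tags REST API with HTTP basic auth; the credentials and image URL are placeholders:

    import requests

    # Placeholder credentials; Imagga authenticates with an API key/secret pair
    auth = ("<api-key>", "<api-secret>")

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/photo.jpg"},  # placeholder
        auth=auth,
    )

    # Each entry holds a confidence and a language-keyed tag name
    for entry in response.json()["result"]["tags"]:
        print(f"{entry['tag']['en']} {entry['confidence']:.1f}")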

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 99.3
black 81
person 79.7
black and white 76.8
white 67.4
old 50.7

Face analysis

AWS Rekognition

Age 35-43
Gender Female, 76.1%
Calm 41.4%
Disgusted 25.1%
Fear 11.5%
Angry 7.7%
Sad 5%
Confused 4.9%
Surprised 2.9%
Happy 1.4%

AWS Rekognition

Age 34-42
Gender Male, 100%
Calm 44.6%
Disgusted 25%
Fear 19.1%
Surprised 3.8%
Happy 2.9%
Confused 2.8%
Sad 1.1%
Angry 0.7%

AWS Rekognition

Age 22-30
Gender Male, 99.4%
Sad 51.4%
Calm 28%
Disgusted 7.3%
Fear 5.7%
Angry 2.2%
Happy 2.1%
Surprised 2%
Confused 1.3%
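
The three blocks above are per-face results: an estimated age range, a gender call with confidence, and an emotion distribution. A minimal sketch with boto3's detect_faces; the file path is an assumption, and this is not the museum's actual pipeline:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:  # hypothetical local copy
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, and emotions
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        gender = face["Gender"]
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions arrive unsorted; sort descending to match the listing above
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")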

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
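
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why every line above reads "Very unlikely". A minimal sketch with the google-cloud-vision client library; the file path is an assumption:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:  # hypothetical local copy
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Each attribute is a Likelihood enum, e.g. VERY_UNLIKELY
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)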

Feature analysis

Amazon

Person 99.3%
Guitar 98.7%
Wheel 98%

Captions

Microsoft

a vintage photo of a person 82.4%
a vintage photo of a person 75.1%
a vintage photo of a person standing in front of a sign 72.3%
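
The Microsoft captions above are ranked candidates with confidences. A minimal sketch against the Azure Computer Vision v3.2 "describe" REST endpoint; the endpoint, key, and file path are placeholders:

    import requests

    endpoint = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
    key = "<subscription-key>"  # placeholder

    with open("photo.jpg", "rb") as f:  # hypothetical local copy
        response = requests.post(
            f"{endpoint}/vision/v3.2/describe",
            params={"maxCandidates": 3},  # three candidates, as listed above
            headers={
                "Ocp-Apim-Subscription-Key": key,
                "Content-Type": "application/octet-stream",
            },
            data=f.read(),
        )

    # Confidence is reported on a 0-1 scale; convert to a percentage
    for caption in response.json()["description"]["captions"]:
        print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")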

Text analysis

Amazon

New
The New Foundation
Foundation
The
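
Rekognition's text detection returns both full lines and their individual words, which is likely why "The New Foundation" appears above alongside its parts. A minimal sketch with boto3's detect_text; the file path is an assumption:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:  # hypothetical local copy
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # Detections include both LINE and WORD entries
    for detection in response["TextDetections"]:
        print(detection["DetectedText"])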

Google

Foundation
pwwwwwwwww New Foundation
New
pwwwwwwwww