Human Generated Data

Title

Untitled (band playing on mobile stage in field - parking lot)

Date

c. 1975

People

Artist: Ken Whitmire Associates, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1383

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Musician 99.8
Human 99.8
Musical Instrument 99.8
Person 99.3
Stage 98.8
Guitar 98.7
Leisure Activities 98.7
Person 98.6
Machine 98
Wheel 98
Guitar 97.6
Person 94.5
Music Band 91.5
Guitarist 87.3
Performer 87.3
Drum 73.8
Percussion 73.8
Drummer 56.2
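
Label lists like the Amazon set above have the shape of AWS Rekognition's DetectLabels output, where each label carries a confidence score in percent. Below is a minimal sketch of how such tags could be regenerated with boto3, assuming configured AWS credentials and a local copy of the photograph under the hypothetical name untitled_band.jpg; the thresholds are illustrative, not taken from this record.

import boto3

# Hypothetical local copy of the photograph; not part of this record.
IMAGE_PATH = "untitled_band.jpg"

rekognition = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=25,        # assumed cap; the record lists 18 labels
        MinConfidence=50.0,  # lowest score above is Drummer at 56.2
    )

# Print "Label Confidence" pairs, mirroring the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")

The repeated names with different scores (Person, Guitar) are consistent with per-instance detections, which DetectLabels reports under each label's Instances field.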

Clarifai
created on 2023-10-26

people 99.9
vehicle 98.9
monochrome 98.2
adult 97.6
group 97.4
group together 96.8
street 96.7
transportation system 95.7
many 94.5
man 92.4
woman 92.2
vintage 90.9
driver 85.2
retro 81.8
recreation 81.5
art 81.2
several 79.1
wear 77.8
one 76.9
black and white 74.6

Imagga
created on 2022-01-22

musical instrument 35.1
percussion instrument 25.9
sky 17.2
steel drum 17.1
stall 16.1
building 14
accordion 13.9
keyboard instrument 13.3
landscape 12.6
city 12.5
travel 12
old 11.8
sea 11.7
architecture 11.7
people 11.7
house 11.7
outdoor 11.5
vehicle 11.1
beach 11
industrial 10.9
water 10.7
stage 10.2
ocean 10
wind instrument 9.8
vacation 9.8
mountain 9.8
car 9.5
platform 9.2
stone 9.1
marimba 9
summer 9
sand 8.7
rock 8.7
industry 8.5
clouds 8.4
seller 8.2
transportation 8.1
history 8
farm 8
holiday 7.9
urban 7.9
sunny 7.7
seascape 7.6
ashcan 7.6
freight car 7.6
field 7.5
hill 7.5
resort 7.5
tourism 7.4
man 7.4
dirty 7.2
sunset 7.2
coast 7.2
wheeled vehicle 7.1
night 7.1
male 7.1
work 7.1
bin 7
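
The Imagga tags follow the same tag-plus-confidence pattern and are retrievable through Imagga's documented /v2/tags REST endpoint with basic-auth API keys. A hedged sketch using requests; the keys and filename below are placeholders, not values from this record.

import requests

API_KEY = "your_api_key"        # placeholder credentials
API_SECRET = "your_api_secret"  # placeholder credentials

with open("untitled_band.jpg", "rb") as f:  # assumed local filename
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),
        files={"image": f},
    )

# Each result entry pairs an English tag with a confidence score.
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")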

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 99.3
black 81
person 79.7
black and white 76.8
white 67.4
old 50.7

Face analysis

AWS Rekognition

Age 35-43
Gender Female, 76.1%
Calm 41.4%
Disgusted 25.1%
Fear 11.5%
Angry 7.7%
Sad 5%
Confused 4.9%
Surprised 2.9%
Happy 1.4%

AWS Rekognition

Age 34-42
Gender Male, 100%
Calm 44.6%
Disgusted 25%
Fear 19.1%
Surprised 3.8%
Happy 2.9%
Confused 2.8%
Sad 1.1%
Angry 0.7%

AWS Rekognition

Age 22-30
Gender Male, 99.4%
Sad 51.4%
Calm 28%
Disgusted 7.3%
Fear 5.7%
Angry 2.2%
Happy 2.1%
Surprised 2%
Confused 1.3%
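
The three face entries above match the shape of AWS Rekognition's DetectFaces output with full attributes requested: an estimated age range, a gender guess with confidence, and an emotion distribution, listed here in descending order. A minimal boto3 sketch under the same assumed filename:

import boto3

rekognition = boto3.client("rekognition")

with open("untitled_band.jpg", "rb") as f:  # assumed local filename
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # needed for AgeRange, Gender, and Emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Emotions arrive unsorted; sort to match the descending lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")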

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
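
The Google Vision blocks report categorical likelihoods (Very unlikely through Very likely) rather than percentages; each detected face exposes these as enum fields on its annotation. A sketch using the google-cloud-vision client, again with an assumed local filename:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("untitled_band.jpg", "rb") as f:  # assumed local filename
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each field is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY).
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)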

Feature analysis

Amazon

Person 99.3%
Guitar 98.7%
Wheel 98%

Categories

Imagga

cars vehicles 99.7%

Text analysis

Amazon

New
The New Foundation
Foundation
The
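
The Amazon text results match Rekognition's DetectText operation, which returns whole-LINE detections alongside their component WORDs; that is why "The New Foundation" appears together with "The", "New", and "Foundation" individually. A short sketch under the same assumed filename:

import boto3

rekognition = boto3.client("rekognition")

with open("untitled_band.jpg", "rb") as f:  # assumed local filename
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# Each detection is typed LINE or WORD, so lines and their words interleave.
for detection in response["TextDetections"]:
    print(detection["DetectedText"], detection["Type"])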

Google

pwwwwwwwww
Foundation
pwwwwwwwww New Foundation
New