Human Generated Data

Title

Untitled (people watching dance performance)

Date

c. 1950

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20224

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99
Human 99
Person 98.8
Person 97.4
Person 96.8
Restaurant 96
Person 92.7
Person 91.3
Cafe 85.2
Person 83.8
Person 82.8
Meal 80.7
Food 80.7
Leisure Activities 79.9
Cafeteria 75.7
Person 74
Interior Design 70.8
Indoors 70.8
Crowd 67.4
Person 62.4
People 61.7
Room 61.1
Theme Park 58
Amusement Park 58
Food Court 55.6
Musician 55
Musical Instrument 55
Person 48.8

Clarifai
created on 2023-10-22

people 99.9
group 99.8
many 99.2
music 98
woman 97.7
man 96.4
crowd 96.1
group together 95.2
adult 95
furniture 94.6
musician 91.9
recreation 91.2
indoors 90.6
dancing 90.5
several 90.4
monochrome 90.3
audience 90.2
child 90
administration 88.2
leader 88.1

Imagga
created on 2022-03-05

stage 39.2
bass 35.6
music 32.5
musician 26.7
platform 24.6
guitar 22.1
singer 21.4
concert 21.3
performer 19.5
silhouette 19
musical 18.2
person 17
crowd 16.3
black 16.2
people 16.2
group 16.1
rock 15.6
man 15.4
grunge 15.3
club 15.1
musical instrument 15
fun 15
male 14.9
party 14.6
performance 14.3
stringed instrument 14
disco 13.5
instrument 13.1
dancer 13
play 12.9
studio 12.9
dance 12.5
salon 12.3
guitarist 11.8
hand 11.4
player 11.3
art 11.1
entertainment 11
adult 10.4
sound 10.3
entertainer 10.3
event 10.2
nightclub 9.8
night 9.8
band 9.7
style 9.6
urban 9.6
design 9.6
women 9.5
show 9.5
men 9.4
light 9.3
bowed stringed instrument 8.9
sing 8.8
body 8.8
microphone 8.7
star 8.4
smoke 8.4
playing 8.2
symbol 8.1
decoration 8
outfit 7.9
world 7.9
nightlife 7.8
drawing 7.8
melody 7.8
color 7.8
modern 7.7
dancing 7.7
youth 7.7
friends 7.5
city 7.5
leisure 7.5
life 7.3
graphic 7.3

Google
created on 2022-03-05

Photograph 94.2
White 92.2
Black 89.9
Black-and-white 86.5
Organism 85.2
Style 84.1
Hat 80.6
Art 78.6
Monochrome 78.1
Monochrome photography 77.8
People 77.8
Snapshot 74.3
Event 74
Font 73.6
Crowd 72.9
Suit 69.9
Chair 69.3
Room 66.8
Stock photography 64.1
Illustration 61.3

Microsoft
created on 2022-03-05

text 96.2
person 90.3
clothing 87.2
cartoon 81.3
drawing 68.5
crowd 23.1

Face analysis

AWS Rekognition

Age 19-27
Gender Female, 53.4%
Sad 78.1%
Calm 9.4%
Fear 7.3%
Angry 2%
Confused 1.5%
Happy 1%
Disgusted 0.5%
Surprised 0.3%

AWS Rekognition

Age 19-27
Gender Female, 95.2%
Calm 86.8%
Sad 9.9%
Happy 0.9%
Fear 0.8%
Disgusted 0.5%
Angry 0.5%
Confused 0.4%
Surprised 0.2%

AWS Rekognition

Age 24-34
Gender Male, 88.8%
Calm 86.1%
Disgusted 9.8%
Sad 1.6%
Confused 0.9%
Angry 0.7%
Surprised 0.3%
Fear 0.3%
Happy 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Feature analysis

Amazon

Person
Person 99%
Person 98.8%
Person 97.4%
Person 96.8%
Person 92.7%
Person 91.3%
Person 83.8%
Person 82.8%
Person 74%
Person 62.4%
Person 48.8%

Text analysis

Amazon

٢ад
DOLO

Google

YT37A°2-XAGO