Human Generated Data

Title

Untitled (women in a line practicing can-can dance on stage)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15259

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Dance Pose 99.8
Leisure Activities 99.8
Person 99.5
Human 99.5
Person 99.5
Person 99.3
Person 98.5
Person 97.5
Clothing 95.9
Apparel 95.9
Person 94.4
Person 88.2
Dance 80.9
Shorts 79
Person 78.4
People 66.5
Female 64.8
Portrait 59.6
Photography 59.6
Face 59.6
Photo 59.6

Clarifai
created on 2023-10-29

people 99.8
man 97.5
adult 96.7
woman 95.5
school 93.7
group together 92.2
adolescent 90.5
education 90.3
group 89.9
boy 88.9
child 88.6
wear 87.3
recreation 84.1
monochrome 83.5
athlete 82.4
portrait 81.9
two 80.9
music 80.1
room 76.8
uniform 76.4

Imagga
created on 2022-03-05

shop 49.5
shoe shop 48.6
mercantile establishment 33.4
city 25.8
building 24.5
place of business 22.3
urban 21
architecture 20.7
people 18.4
sidewalk 17.7
street 16.6
fashion 14.3
old 13.9
business 12.8
structure 12.7
person 12.4
establishment 12
travel 12
life 11.1
adult 11
portrait 11
black 11
stone 11
man 10.7
scene 10.4
women 10.3
landmark 9.9
tourism 9.9
outdoors 9.7
style 9.6
window 9.5
men 9.4
historic 9.2
sculpture 8.8
steps 8.8
station 8.6
model 8.6
male 8.5
shopping 8.3
dress 8.1
transportation 8.1
history 8
column 8
interior 8
lifestyle 7.9
design 7.9
center 7.9
high 7.8
mall 7.8
entrance 7.7
motion 7.7
crowd 7.7
blurred 7.7
casual 7.6
walking 7.6
one 7.5
speed 7.3
statue 7.3
school 7.2
gate 7.2
day 7.1
modern 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 95.5
black and white 88.7
house 83.3
clothing 64.1
person 63.6
dance 51.8

Color Analysis

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 93.2%
Calm 84.5%
Happy 6.8%
Disgusted 3.8%
Angry 1.1%
Confused 1.1%
Sad 1.1%
Surprised 1.1%
Fear 0.4%

AWS Rekognition

Age 37-45
Gender Male, 79.6%
Calm 98.2%
Surprised 0.7%
Sad 0.3%
Fear 0.3%
Angry 0.2%
Disgusted 0.1%
Confused 0.1%
Happy 0.1%

AWS Rekognition

Age 31-41
Gender Male, 92.3%
Calm 97.5%
Sad 1.1%
Angry 0.6%
Fear 0.3%
Confused 0.2%
Happy 0.1%
Surprised 0.1%
Disgusted 0.1%

AWS Rekognition

Age 19-27
Gender Female, 60.3%
Calm 98.8%
Confused 0.5%
Surprised 0.3%
Happy 0.2%
Disgusted 0.1%
Sad 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 28-38
Gender Male, 98.8%
Calm 99.1%
Fear 0.4%
Happy 0.2%
Confused 0.1%
Disgusted 0.1%
Surprised 0.1%
Sad 0%
Angry 0%

AWS Rekognition

Age 18-24
Gender Female, 60.8%
Calm 80.5%
Confused 7.3%
Happy 3.4%
Surprised 3%
Fear 2.4%
Angry 1.3%
Sad 1.1%
Disgusted 0.9%

AWS Rekognition

Age 23-33
Gender Female, 92.2%
Sad 69.8%
Calm 25%
Surprised 1.7%
Angry 1%
Disgusted 0.9%
Fear 0.6%
Confused 0.5%
Happy 0.4%

AWS Rekognition

Age 16-22
Gender Female, 67.1%
Disgusted 43.6%
Fear 30.7%
Calm 7.2%
Confused 5.2%
Surprised 3.9%
Sad 3.6%
Angry 3.1%
Happy 2.8%

AWS Rekognition

Age 33-41
Gender Male, 94.1%
Calm 93.5%
Happy 3%
Disgusted 0.9%
Surprised 0.7%
Angry 0.6%
Fear 0.5%
Confused 0.4%
Sad 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99.5%
Person 99.5%
Person 99.3%
Person 98.5%
Person 97.5%
Person 94.4%
Person 88.2%
Person 78.4%

Categories

Text analysis

Amazon

4
KODAK
SAFETY
SAFETY FILM
FILM
C

Google

FILM KODAK SAFETY FILM
FILM
KODAK
SAFETY