Human Generated Data

Title

Untitled (women in a line practicing can-can dance on stage)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15258
Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 96.8
Human 96.8
Art 93.9
Person 91.9
Drawing 90.6
Person 90
Person 89.6
Person 85.7
Person 85.2
Person 84.8
Clothing 82.8
Apparel 82.8
Person 79
Horse 76.5
Animal 76.5
Mammal 76.5
Sketch 73.3
Person 64.7
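
The record does not document the museum's actual pipeline, but label/confidence pairs like those above are the standard output shape of Amazon Rekognition's DetectLabels API. The sketch below, with a hypothetical local file name, shows how such tags could be reproduced; treat it as an assumption about the tooling, not a description of it.

    # Minimal sketch: reproduce label/confidence pairs like the list above
    # with Amazon Rekognition DetectLabels. The file name is hypothetical.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("4.2002.15258.jpg", "rb") as f:  # hypothetical local copy of the image
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=60,  # only return labels scored at 60% or higher
    )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')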

Clarifai
created on 2023-10-29

people 99.8
man 97.5
adult 97
illustration 95.1
group 94
monochrome 92.8
woman 92.5
street 88.1
art 80.8
print 80.5
family 80.1
child 79.9
home 78.2
classic 76
administration 75.7
vintage 74
music 73.9
education 73.6
leader 72.8
theater 70.6

Imagga
created on 2022-03-05

architecture 33.7
building 31.9
newspaper 25.8
barbershop 25.4
sculpture 25.4
sketch 25.3
city 24.1
shop 22.4
history 21.5
statue 21.1
old 20.9
product 20.5
drawing 20
tourism 19
landmark 19
travel 18.3
art 17.8
mercantile establishment 17.1
creation 17.1
historical 16.9
monument 16.8
house 16.8
historic 16.5
column 16.5
daily 16
famous 15.8
representation 15.8
structure 15.5
stone 15.2
culture 14.5
ancient 13.8
marble 13.2
balcony 12.6
place of business 11.8
window 11.7
chair 10.5
urban 10.5
fountain 10.1
water 10
tourist 10
vintage 9.9
room 9.6
antique 9.5
library 8.9
architectural 8.6
god 8.6
design 8.4
classic 8.3
traditional 8.3
street 8.3
university 8
decoration 7.9
classroom 7.9
artistic 7.8
baroque 7.8
people 7.8
roman 7.8
portrait 7.8
luxury 7.7
detail 7.2
life 7.2

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

building 99.2
text 98.2
house 78.9
drawing 71.7
person 50.4
old 41.1
store 40.7

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 23-33
Gender Male, 60.2%
Calm 97.1%
Sad 1.4%
Surprised 0.4%
Confused 0.4%
Disgusted 0.2%
Angry 0.2%
Happy 0.2%
Fear 0.1%

AWS Rekognition

Age 23-33
Gender Male, 98.2%
Calm 99%
Happy 0.4%
Surprised 0.2%
Angry 0.1%
Confused 0.1%
Disgusted 0.1%
Sad 0.1%
Fear 0%

AWS Rekognition

Age 24-34
Gender Female, 82.3%
Calm 96.6%
Surprised 2.4%
Happy 0.3%
Confused 0.3%
Sad 0.2%
Disgusted 0.2%
Angry 0.1%
Fear 0.1%
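
Age ranges, gender guesses, and ranked emotion scores like the three blocks above match the output of Amazon Rekognition's DetectFaces API when all facial attributes are requested. A minimal sketch follows, assuming a hypothetical local copy of the image; the museum's actual invocation is not documented here.

    # Minimal sketch: per-face age, gender, and emotion estimates via
    # Amazon Rekognition DetectFaces. Attributes=["ALL"] is required to
    # get age range and emotions. The file name is hypothetical.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("4.2002.15258.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        # Emotions arrive unsorted; sort descending to match the lists above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')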

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
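
The six blocks above, one per detected face, use the likelihood scale of the Google Cloud Vision face-detection API. A minimal sketch of that call follows, again with a hypothetical file name; the mapping tuple converts the API's Likelihood enum values to the labels shown above.

    # Minimal sketch: per-face likelihood ratings via Google Cloud Vision
    # face detection. The file name is hypothetical.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("4.2002.15258.jpg", "rb") as f:  # hypothetical local copy
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Likelihood enum values (0-5) mapped to the labels used above.
    likelihood = ("Unknown", "Very unlikely", "Unlikely",
                  "Possible", "Likely", "Very likely")

    for face in response.face_annotations:
        print("Surprise", likelihood[face.surprise_likelihood])
        print("Anger", likelihood[face.anger_likelihood])
        print("Sorrow", likelihood[face.sorrow_likelihood])
        print("Joy", likelihood[face.joy_likelihood])
        print("Headwear", likelihood[face.headwear_likelihood])
        print("Blurred", likelihood[face.blurred_likelihood])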

Feature analysis

Amazon

Person
Person 96.8%
Person 91.9%
Person 90%
Person 89.6%
Person 85.7%
Person 85.2%
Person 84.8%
Person 79%
Person 64.7%

Horse
Horse 76.5%

Categories

Text analysis

Amazon

5
KODAK
KODAK SAFETY
SAFETY
FILM
AK SAFETY FILM
AK
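
The overlapping fragments above (the Kodak safety-film edge markings) are consistent with Amazon Rekognition's DetectText API, which reports whole lines and their individual words as separate detections. A minimal, assumed sketch with a hypothetical file name:

    # Minimal sketch: OCR fragments via Amazon Rekognition DetectText.
    # The file name is hypothetical.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("4.2002.15258.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    for detection in response["TextDetections"]:
        # Rekognition reports LINE and WORD detections separately, which is
        # why both "KODAK SAFETY" and "SAFETY" appear in the list above.
        print(detection["Type"], detection["DetectedText"])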

Google

AK S'AFETY FILM KODAK S'AFETY
AK
S'AFETY
FILM
KODAK