Human Generated Data

Title

Untitled (women in a line practicing can-can dance on stage)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15257

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.6
Human 99.6
Person 98
Clothing 97.2
Apparel 97.2
Horse 91.1
Animal 91.1
Mammal 91.1
Brick 82.9
Shorts 79.5
Person 78.3
Housing 76.9
Building 76.9
Suit 74.5
Coat 74.5
Overcoat 74.5
Urban 74.1
Person 72.2
Outdoors 70.7
People 67.2
City 67
Town 67
Person 66.6
Drawing 60.9
Art 60.9
Shoe 59.5
Footwear 59.5
High Rise 59.5
Nature 57.9
Female 57.9
Pants 56.8
Person 42.1
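
These labels match the shape of Amazon Rekognition's DetectLabels response: a label name plus a 0-100 confidence score. A minimal sketch of how such tags could be reproduced with boto3; the local file name and AWS credentials are assumptions:

import boto3

# Assumed local copy of the photograph; any JPEG/PNG bytes work.
with open("4.2002.15257.jpg", "rb") as f:
    image_bytes = f.read()

rekognition = boto3.client("rekognition")
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=40,  # the lowest confidence listed above is 42.1
)
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')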

Clarifai
created on 2023-10-29

people 99.7
adult 97.1
man 95.6
woman 94.6
education 93.4
group 92.5
street 89.3
monochrome 89
school 87.8
wear 85.6
group together 85.1
two 83.6
indoors 83.3
room 83
furniture 82.2
child 81.3
one 79.3
many 77.3
home 76.7
adolescent 76.4
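
These concepts look like output from Clarifai's general image recognition model, which scores each concept from 0 to 1 (rendered above as percentages). A hedged sketch against Clarifai's v2 REST API; the model id, image URL, and access token are assumptions:

import requests

resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",  # assumed model id
    headers={"Authorization": "Key YOUR_PAT"},  # assumed personal access token
    json={"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]},
)
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(concept["name"], round(concept["value"] * 100, 1))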

Imagga
created on 2022-03-05

shoe shop 100
shop 100
mercantile establishment 81.3
place of business 54.2
establishment 27.3
building 25.6
city 19.1
architecture 18.1
structure 17.2
library 16.7
urban 16.6
old 16
interior 15.9
people 15.6
window 14.8
fashion 14.3
house 13.4
adult 11.6
indoors 11.4
travel 11.3
style 11.1
chair 10.5
sculpture 10.5
room 10.2
stone 10.1
person 10.1
historic 10.1
design 9.6
scene 9.5
women 9.5
sitting 9.4
history 8.9
clothing 8.9
man 8.7
statue 8.6
two 8.5
art 8.5
portrait 8.4
wood 8.3
street 8.3
vintage 8.3
tourism 8.2
one 8.2
life 8.1
religion 8.1
black 7.8
decoration 7.3
business 7.3
home 7.2
sidewalk 7.1
steel 7.1
modern 7
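
Imagga's tagging endpoint returns a similar ranked list. A hedged sketch against its v2 REST API using HTTP basic auth; the credentials and image URL are assumptions:

import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},  # assumed public URL
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),  # assumed credentials
)
for tag in resp.json()["result"]["tags"]:
    print(tag["tag"]["en"], tag["confidence"])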

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 97
horse 95.1
black and white 85.6
animal 78.9
black 66.5
white 63.4
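
The Microsoft tags are consistent with Azure Computer Vision's Analyze endpoint, which scores tags from 0 to 1 (shown above as percentages). A hedged sketch; the resource endpoint and subscription key are assumptions:

import requests

endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # assumed resource
resp = requests.post(
    f"{endpoint}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},  # assumed key
    json={"url": "https://example.org/photo.jpg"},
)
for tag in resp.json()["tags"]:
    print(tag["name"], round(tag["confidence"] * 100, 1))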

Face analysis

AWS Rekognition

Age 28-38
Gender Female, 89.2%
Calm 39.1%
Disgusted 23.3%
Happy 15.7%
Angry 8.4%
Sad 5%
Surprised 3.6%
Confused 3.5%
Fear 1.4%

AWS Rekognition

Age 23-31
Gender Female, 60.1%
Calm 99.8%
Disgusted 0.1%
Happy 0%
Confused 0%
Surprised 0%
Angry 0%
Sad 0%
Fear 0%

AWS Rekognition

Age 41-49
Gender Male, 96.9%
Calm 99.8%
Sad 0.1%
Disgusted 0.1%
Fear 0%
Surprised 0%
Confused 0%
Angry 0%
Happy 0%

AWS Rekognition

Age 23-31
Gender Male, 65%
Calm 95.8%
Sad 1%
Surprised 1%
Happy 0.7%
Confused 0.6%
Angry 0.5%
Disgusted 0.4%
Fear 0.1%
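
Each block above matches the shape of Amazon Rekognition's DetectFaces response: an estimated age range, a gender guess with confidence, and a confidence-ranked list of emotions. A minimal sketch with boto3; the local file name and credentials are assumptions, as before:

import boto3

rekognition = boto3.client("rekognition")
with open("4.2002.15257.jpg", "rb") as f:  # assumed local copy
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # required to get AgeRange, Gender, and Emotions
)
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')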

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
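
The five blocks above match Google Cloud Vision face detection, which reports each attribute as a likelihood bucket (VERY_UNLIKELY through VERY_LIKELY) rather than a percentage. A minimal sketch with the google-cloud-vision client; the local file name is an assumption:

from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("4.2002.15257.jpg", "rb") as f:  # assumed local copy
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    for name, value in [
        ("Surprise", face.surprise_likelihood),
        ("Anger", face.anger_likelihood),
        ("Sorrow", face.sorrow_likelihood),
        ("Joy", face.joy_likelihood),
        ("Headwear", face.headwear_likelihood),
        ("Blurred", face.blurred_likelihood),
    ]:
        # e.g. VERY_UNLIKELY -> "Very Unlikely"
        print(name, vision.Likelihood(value).name.replace("_", " ").title())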

Feature analysis

Amazon

Person 99.6%, 98%, 78.3%, 72.2%, 66.6%, 42.1%
Horse 91.1%
Shoe 59.5%

Text analysis

Amazon

KODAK
SAFETY
KODAK SAFETY FILM
FILM
3
в
In
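
These lines match Rekognition's DetectText output, which returns both full lines and individual words; the stray "в" is simply what the OCR model read in the film's edge markings. A minimal sketch; the local file name is an assumption:

import boto3

rekognition = boto3.client("rekognition")
with open("4.2002.15257.jpg", "rb") as f:  # assumed local copy
    response = rekognition.detect_text(Image={"Bytes": f.read()})
for detection in response["TextDetections"]:
    # Type is "LINE" or "WORD"; both appear in the list above.
    print(detection["Type"], detection["DetectedText"])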

Google

3. KODAK S'AFETY FILM KODAK
3.
KODAK
S'AFETY
FILM
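
Google Cloud Vision's text detection behaves similarly: the first annotation is the full recovered string ("3. KODAK S'AFETY FILM KODAK" above) and the rest are individual tokens. A minimal sketch with the same client as before; the local file name is an assumption:

from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("4.2002.15257.jpg", "rb") as f:  # assumed local copy
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)
for annotation in response.text_annotations:
    print(annotation.description)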