Human Generated Data

Title

Untitled (dancer on stage)

Date

c. 1950

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20216

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Interior Design 99.4
Indoors 99.4
Person 98.6
Human 98.6
Person 98.5
Person 92.3
Floor 91.7
Living Room 90.5
Room 90.5
Person 87.5
Flooring 83.2
Wood 82.4
Person 78.2
Person 68.9
Home Decor 62.2
People 62
Curtain 60.4
Stage 59.7
Bedroom 59.6
Suit 57.8
Coat 57.8
Overcoat 57.8
Clothing 57.8
Apparel 57.8
Hardwood 57.7
Leisure Activities 56.6

Clarifai
created on 2023-10-22

people 99.8
adult 97.1
wear 96.6
group 96.4
two 95.9
woman 93.5
many 93.2
one 92.9
group together 92.2
furniture 91.8
several 91.8
monochrome 91.4
man 90.2
room 89.6
child 89.1
music 87.7
indoors 86.1
no person 85.5
three 85.4
musician 84.8

Imagga
created on 2022-03-05

column 26.1
passenger 23.3
architecture 22.1
travel 17.6
city 17.4
vehicle 14.6
car 14.1
transport 13.7
transportation 13.4
wheeled vehicle 12
building 11.9
station 11.5
people 11.2
traditional 10.8
tourism 10.7
statue 10.6
train 10.5
ancient 10.4
history 9.8
old 9.7
adult 9.7
structure 9.6
urban 9.6
industry 9.4
monument 9.3
power 9.2
business 9.1
industrial 9.1
dress 9
palace 8.9
man 8.8
deck 8.4
inside 8.3
life 8.1
religion 8.1
sculpture 8
interior 8
gas 7.7
house 7.7
sky 7.6
historical 7.5
vacation 7.4
conveyance 7.4
metal 7.2
landmark 7.2
window 7.2
day 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

black and white 88.1
text 80.3

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 53-61
Gender Male, 98.3%
Calm 69.9%
Confused 15.2%
Sad 11.7%
Surprised 0.8%
Disgusted 0.7%
Fear 0.7%
Happy 0.5%
Angry 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Suit
Person 98.6%
Person 98.5%
Person 92.3%
Person 87.5%
Person 78.2%
Person 68.9%
Suit 57.8%

Categories

Text analysis

Amazon

٢ад
YТ3A°-X

Google

YT37A°2-XAGO
YT37A°2-XAGO