Human Generated Data

Title

Untitled (baby posed on blanketed table with mother watching)

Date

1949

People

Artist: Martin Schweig, American, 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9304

Machine Generated Data

Tags (confidence scores, 0-100)

Amazon
created on 2022-01-23

Clothing 99.6
Apparel 99.6
Furniture 99
Person 98.9
Human 98.9
Person 91.2
Female 85.1
Couch 75.6
Home Decor 72.5
Face 71.7
Woman 69.1
Portrait 67.2
Photography 67.2
Photo 67.2
People 63.3
Curtain 62.8
Fashion 61.8
Gown 61.5
Chair 58.8
Robe 57.4
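
The pairs above are label names with 0-100 confidence scores. As a minimal sketch, labels in this shape can be produced with Amazon Rekognition's DetectLabels API via boto3; the file name and thresholds below are illustrative placeholders, not part of this record:

    import boto3

    def detect_labels(image_path, max_labels=20, min_confidence=50.0):
        # Calls Rekognition's DetectLabels on local image bytes; each label
        # carries a name plus a 0-100 confidence, matching the list above.
        client = boto3.client("rekognition")
        with open(image_path, "rb") as f:
            response = client.detect_labels(
                Image={"Bytes": f.read()},
                MaxLabels=max_labels,
                MinConfidence=min_confidence,
            )
        return [(lab["Name"], round(lab["Confidence"], 1)) for lab in response["Labels"]]

    for name, score in detect_labels("photo.jpg"):  # hypothetical filename
        print(name, score)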

Clarifai
created on 2023-10-27

people 98.9
monochrome 97
man 96
child 95.3
woman 93.9
music 91.6
wear 90.6
adult 90.1
movie 89.3
art 86.4
costume 86.4
furniture 84
vector 81.3
portrait 81
two 80.4
family 80.1
retro 79.8
curtain 79.7
illustration 78.2
boy 78.1

Imagga
created on 2022-01-23

grand piano 55
percussion instrument 54.1
piano 44
musical instrument 42.1
stringed instrument 37.9
keyboard instrument 33.9
marimba 24.5
people 16.2
person 15.2
studio couch 15
relaxation 14.2
interior 13.3
lifestyle 13
room 12.5
man 12.1
convertible 12
happiness 11.7
furniture 11.7
leisure 11.6
sofa 11.6
water 11.3
holiday 10.7
travel 10.6
adult 10.4
women 10.3
happy 10
seat 9.8
modern 9.8
fashion 9.8
art 9.8
cheerful 9.7
portrait 9.7
luxury 9.4
relax 9.3
clothing 9.2
male 9.2
smile 8.5
resort 8.4
house 8.4
wood 8.3
fun 8.2
vacation 8.2
work 8.1
negative 7.9
attractive 7.7
old 7.7
hotel 7.6
casual 7.6
elegance 7.5
outdoors 7.5
window 7.3
smiling 7.2
body 7.2
home 7.2
cute 7.2
indoors 7
sky 7
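
Imagga exposes a similar tag/confidence listing through its public tagging endpoint. A minimal sketch, assuming HTTP Basic auth with an API key and secret; the credentials and image URL below are placeholders:

    import requests

    API_KEY, API_SECRET = "your_key", "your_secret"  # placeholder credentials
    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/photo.jpg"},  # placeholder URL
        auth=(API_KEY, API_SECRET),
    )
    resp.raise_for_status()
    # Imagga nests the tag text under a language key ("en").
    for item in resp.json()["result"]["tags"]:
        print(item["tag"]["en"], round(item["confidence"], 1))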

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 95.2
black and white 92.9
human face 67.5
person 62.3
clothing 57.7

Face analysis

AWS Rekognition (face 1)

Age 38-46
Gender Male, 66.3%
Surprised 63.8%
Happy 34.4%
Sad 0.4%
Confused 0.3%
Calm 0.3%
Angry 0.3%
Disgusted 0.2%
Fear 0.2%

AWS Rekognition (face 2)

Age 16-24
Gender Male, 100%
Surprised 96.1%
Calm 2.5%
Fear 0.6%
Disgusted 0.3%
Confused 0.2%
Angry 0.1%
Happy 0.1%
Sad 0%
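
Both face blocks above follow the shape of Rekognition's DetectFaces output with all facial attributes enabled. A minimal sketch, assuming boto3 and a local image file (the path is illustrative):

    import boto3

    client = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:  # hypothetical filename
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions arrive unsorted; sort descending to mirror the listing above.
        for emo in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emo['Type'].title()} {emo['Confidence']:.1f}%")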

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
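
Unlike Rekognition, Google Cloud Vision reports bucketed likelihood verdicts rather than percentages, which matches the block above. A minimal sketch using the google-cloud-vision client; credentials and the file path are assumed:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()  # assumes configured credentials
    with open("photo.jpg", "rb") as f:  # hypothetical filename
        image = vision.Image(content=f.read())

    for face in client.face_detection(image=image).face_annotations:
        # Each field is a Likelihood enum: VERY_UNLIKELY ... VERY_LIKELY.
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)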

Feature analysis

Amazon

Person 98.9%

Text analysis

Amazon

KODУK-SEELA
wases
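
The two strings above read like garbled film-edge markings picked up by OCR. As a minimal sketch, text in this form can be extracted with Rekognition's DetectText API via boto3; the file path below is illustrative:

    import boto3

    client = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:  # hypothetical filename
        response = client.detect_text(Image={"Bytes": f.read()})

    # DetectText returns both LINE and WORD entries; keep whole lines only.
    for det in response["TextDetections"]:
        if det["Type"] == "LINE":
            print(det["DetectedText"], f"{det['Confidence']:.1f}%")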