Human Generated Data

Title

Untitled (girl sitting at little table holding can)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17653

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 97.6
Human 97.6
Blonde 94.3
Female 94.3
Girl 94.3
Kid 94.3
Woman 94.3
Teen 94.3
Child 94.3
Face 92.6
Chair 91.2
Furniture 91.2
Clothing 84.2
Apparel 84.2
Shoe 77
Footwear 77
Person 76.5
Indoors 69.4
Leisure Activities 66.7
Portrait 66.3
Photography 66.3
Photo 66.3
Table 66.2
Room 61.8
Play 58.9
Floor 58.6
Dress 57
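
The record does not state how these labels were produced, but they match the name/confidence output of AWS Rekognition's label-detection API. Assuming that service, a minimal Python sketch along the following lines (the local file name, region, and threshold values are hypothetical) would return comparable tags:

import boto3

# Run label detection against a local copy of the photograph (illustrative only).
client = boto3.client("rekognition", region_name="us-east-1")
with open("4.2002.17653.jpg", "rb") as f:  # hypothetical file name
    response = client.detect_labels(Image={"Bytes": f.read()}, MaxLabels=30, MinConfidence=50)

for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))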

Clarifai
created on 2023-10-28

people 99.8
child 99.2
one 98.1
monochrome 98
furniture 94.1
boy 93.6
wear 93.5
adult 93.3
music 93.1
girl 91.3
two 90.7
portrait 90.4
woman 90.4
indoors 89.8
recreation 88.4
actress 87.9
musician 87.5
street 87
chair 86.6
family 86

Imagga
created on 2022-02-26

person 27.7
adult 27.1
teacher 22.5
black 21.1
people 20.1
sexy 18.5
portrait 18.1
fashion 18.1
educator 17.7
professional 16.4
man 16.1
sitting 15.5
pretty 14.7
dress 14.4
attractive 13.3
model 13.2
body 12.8
human 12.7
women 12.6
lady 12.2
face 12.1
elegance 11.7
lifestyle 11.6
indoors 11.4
crutch 11.3
sport 11.2
style 11.1
hair 11.1
city 10.8
blackboard 10.6
studio 10.6
urban 10.5
clothing 10
cleaner 10
smile 10
one 9.7
smiling 9.4
casual 9.3
male 9.3
outdoor 9.2
vintage 9.1
holding 9.1
posing 8.9
life 8.8
looking 8.8
staff 8.8
play 8.6
cute 8.6
elegant 8.6
stick 8.5
legs 8.5
skin 8.5
chair 8.4
old 8.4
house 8.4
camera 8.3
active 8.3
silhouette 8.3
musical instrument 8.2
happy 8.1
light 8
interior 8
art 7.9
business 7.9
world 7.8
men 7.7
youth 7.7
hand 7.6
dark 7.5
leisure 7.5
room 7.4
music 7.3
sensual 7.3
gorgeous 7.2
home 7.2
cool 7.1
blond 7.1

Microsoft
created on 2022-02-26

text 97.5
person 90.3
clothing 87.9
window 81.6
black and white 81.4

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 6-16
Gender Male, 98.4%
Surprised 46.1%
Happy 44%
Calm 7.2%
Disgusted 0.8%
Confused 0.6%
Fear 0.5%
Angry 0.4%
Sad 0.4%
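
The age range, gender estimate, and emotion scores above are the standard fields of Rekognition's face-detection response. Assuming that API, a self-contained sketch such as the following (file name and region hypothetical) would reproduce the same kinds of values:

import boto3

client = boto3.client("rekognition", region_name="us-east-1")
with open("4.2002.17653.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

# Face detection with all attributes (age range, gender, emotions, etc.).
for face in client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])["FaceDetails"]:
    print(face["AgeRange"])  # e.g. {"Low": 6, "High": 16}
    print(face["Gender"]["Value"], round(face["Gender"]["Confidence"], 1))
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(emotion["Type"], round(emotion["Confidence"], 1))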

Feature analysis

Amazon

Person 97.6%
Person 76.5%
Shoe 77%

Captions

Microsoft
created on 2022-02-26

a person standing in front of a window 52.5%

Text analysis

Amazon

eyland
T33A2
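
These fragments look like Rekognition text-detection output. Assuming that API, a minimal sketch (file name and region hypothetical) would surface the same detected strings with their confidences:

import boto3

client = boto3.client("rekognition", region_name="us-east-1")
with open("4.2002.17653.jpg", "rb") as f:  # hypothetical file name
    response = client.detect_text(Image={"Bytes": f.read()})

# Rekognition returns both LINE and WORD detections; printing lines is enough here.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], round(detection["Confidence"], 1))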

Google

eyland
eyland
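
The Google results are likely from Cloud Vision's text detection, where the first annotation is the full recovered text and subsequent annotations are individual words, which would explain the repeated "eyland". Assuming that API, a minimal sketch (file name hypothetical):

from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("4.2002.17653.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)
# The first annotation is the full detected text; the rest are individual words.
for annotation in response.text_annotations:
    print(annotation.description)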