Human Generated Data

Title

Untitled (two men reflected in mirror in room with dresser, lamp, and wallpaper)

Date

c. 1940

People

Artist: Paul Gittings, American, 1900 - 1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12726

Machine Generated Data

Tags

Amazon
created on 2022-02-04

Furniture 100
Person 98.7
Human 98.7
Crib 96.6
Room 95.7
Indoors 95.7
Person 92.5
Bed 71
Cradle 69.9
Person 56
Nursery 55.4
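
The Amazon tag list above has the shape of output from AWS Rekognition's DetectLabels operation: a label name plus a 0-100 confidence score. The snippet below is a minimal sketch of how such tags could be reproduced with boto3, assuming configured AWS credentials and a hypothetical local copy of the photograph (the file name is illustrative only).

    import boto3

    # Assumed local copy of the photograph; any JPEG/PNG bytes work.
    with open("untitled_gittings_c1940.jpg", "rb") as f:
        image_bytes = f.read()

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # DetectLabels returns objects/scenes with confidence scores (0-100),
    # matching entries such as "Furniture 100" and "Person 98.7" above.
    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,
        MinConfidence=50,
    )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')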

Clarifai
created on 2023-10-27

people 99
man 96.2
monochrome 93.1
indoors 92.7
one 92.5
room 92.4
adult 92.3
furniture 88.4
veil 87.8
lid 85.4
wear 83.2
window 82.8
woman 79.6
family 79.4
art 79.3
two 77.5
religion 76.7
elderly 75.1
leader 74.9
black and white 74.7
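
The Clarifai concepts above resemble output from Clarifai's general image recognition model. The sketch below uses the public v2 REST API and is hedged: the Personal Access Token, the image URL, and the community-model path are assumptions to adapt to your own account and app.

    import requests

    CLARIFAI_PAT = "YOUR_PAT"  # assumption: a valid Clarifai Personal Access Token
    IMAGE_URL = "https://example.org/untitled_gittings_c1940.jpg"  # hypothetical URL

    response = requests.post(
        "https://api.clarifai.com/v2/users/clarifai/apps/main/models/"
        "general-image-recognition/outputs",
        headers={"Authorization": f"Key {CLARIFAI_PAT}"},
        json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
        timeout=30,
    )
    response.raise_for_status()

    # Each concept carries a name and a 0-1 confidence value,
    # printed here as a percentage like the list above.
    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')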

Imagga
created on 2022-02-04

person 31.6
people 24.5
man 22.2
male 21.4
adult 16.4
businessman 15
silhouette 14.1
business 14
happy 13.2
portrait 12.9
symbol 12.1
player 11.6
team 11.6
lights 11.1
black 10.9
blackboard 10.8
cheering 10.8
audience 10.7
sport 10.7
face 10.7
crowd 10.6
child 10.5
pretty 10.5
fun 10.5
nation 10.4
women 10.3
event 10.2
nighttime 9.8
stadium 9.7
sexy 9.6
patriotic 9.6
hair 9.5
lifestyle 9.4
training 9.2
groom 9.2
world 9.2
flag 9.2
television 9.2
attractive 9.1
park 9.1
human 9
work 8.6
happiness 8.6
smile 8.6
expression 8.5
athlete 8.5
design 8.4
manager 8.4
competition 8.2
suit 8.2
style 8.2
disk jockey 8.1
looking 8
icon 7.9
love 7.9
vibrant 7.9
bright 7.9
championship 7.8
match 7.7
skill 7.7
muscular 7.6
casual 7.6
life 7.6
head 7.6
field 7.5
glowing 7.4
cheerful 7.3
lady 7.3
art 7.3
group 7.3
professional 7.2
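
The Imagga tags above match the response shape of Imagga's /v2/tags endpoint, which authenticates with an API key/secret pair over HTTP Basic auth. A minimal sketch follows, assuming placeholder credentials and a hypothetical image URL.

    import requests

    IMAGGA_KEY = "YOUR_API_KEY"        # assumption: Imagga API key
    IMAGGA_SECRET = "YOUR_API_SECRET"  # assumption: Imagga API secret
    IMAGE_URL = "https://example.org/untitled_gittings_c1940.jpg"  # hypothetical URL

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
        timeout=30,
    )
    response.raise_for_status()

    # Each entry carries an English tag and a 0-100 confidence,
    # as in the "person 31.6", "people 24.5", ... list above.
    for entry in response.json()["result"]["tags"]:
        print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')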

Google
created on 2022-02-04

Microsoft
created on 2022-02-04

text 98.9
clothing 94.4
man 91.3
person 87.4
black and white 85.2
drawing 76.2
picture frame 9.3
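
The Microsoft tags above are consistent with Azure AI Vision's image tagging feature. The sketch below targets the v3.2 Analyze REST endpoint; the resource endpoint, key, and image URL are placeholders, not values from this record.

    import requests

    AZURE_ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # assumption
    AZURE_KEY = "YOUR_KEY"                                                # assumption
    IMAGE_URL = "https://example.org/untitled_gittings_c1940.jpg"         # hypothetical URL

    response = requests.post(
        f"{AZURE_ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
        json={"url": IMAGE_URL},
        timeout=30,
    )
    response.raise_for_status()

    # Tags come back with 0-1 confidence scores; printed as percentages here.
    for tag in response.json()["tags"]:
        print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')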

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 23-33
Gender Male, 62.3%
Calm 90.9%
Sad 5.1%
Happy 1.1%
Confused 0.8%
Fear 0.7%
Surprised 0.5%
Angry 0.4%
Disgusted 0.3%
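
The age range, gender, and emotion estimates above follow the response shape of Rekognition's DetectFaces operation with all facial attributes requested. A minimal boto3 sketch, assuming the same hypothetical local image file as before:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("untitled_gittings_c1940.jpg", "rb") as f:  # assumed local copy
        image_bytes = f.read()

    # Attributes=["ALL"] adds AgeRange, Gender, and Emotions to each FaceDetail.
    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')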

Feature analysis

Amazon

Person
Person 98.7%
Person 92.5%
Person 56%
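
The three per-person scores above correspond to individual instances inside a DetectLabels response rather than separate API calls: each detected person is an Instance with its own confidence and bounding box. A short, self-contained sketch under the same assumptions as the earlier DetectLabels example:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("untitled_gittings_c1940.jpg", "rb") as f:  # assumed local copy
        image_bytes = f.read()

    labels = rekognition.detect_labels(Image={"Bytes": image_bytes})["Labels"]

    # Each person Instance has its own confidence, which is why
    # "Person" appears three times with different scores above.
    for label in labels:
        if label["Name"] == "Person":
            for instance in label["Instances"]:
                print(f'Person {instance["Confidence"]:.1f}%')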

Text analysis

Amazon

onn
IR825P onn
IR825P
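
The detected strings above ("onn", "IR825P") are typical of Rekognition's DetectText operation, which returns both whole lines and their component words. A minimal boto3 sketch, again assuming a hypothetical local copy of the image:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("untitled_gittings_c1940.jpg", "rb") as f:  # assumed local copy
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    # Each detection is either a LINE or a WORD, which is why a line such as
    # "IR825P onn" and its individual words can both appear in the output.
    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"], f'{detection["Confidence"]:.1f}')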