Human Generated Data

Title

Untitled (young girl posing in dress in hallway with stairs and dresser in background)

Date

1940-1955

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9057

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Clothing 99.2
Apparel 99.2
Person 98.6
Human 98.6
Lamp 80.1
Table Lamp 78.6
Door 76.1
Furniture 69.7
Chair 69.7
Living Room 59.1
Indoors 59.1
Room 59.1
Lampshade 57.4

Imagga
created on 2022-01-23

people 29.5
person 23.5
man 22.3
adult 19.5
male 19.1
business 18.2
city 17.4
men 17.2
happy 16.9
smiling 16.6
urban 16.6
modern 15.4
lifestyle 15.2
building 14.9
locker 14.9
portrait 14.2
sport 13.5
attractive 12.6
health 12.5
equipment 12.3
office 12.3
cheerful 12.2
training 12
device 12
fastener 11.9
smile 11.4
standing 11.3
strength 11.2
looking 11.2
pretty 11.2
corporate 11.2
women 11.1
black 10.9
fitness 10.8
handsome 10.7
weight 10.6
businessman 10.6
fun 10.5
active 10.5
couple 10.4
exercise 10
music 9.9
team 9.8
gym 9.6
professional 9.5
happiness 9.4
action 9.3
fit 9.2
joy 9.2
fashion 9
restraint 9
blurred 8.6
play 8.6
casual 8.5
silhouette 8.3
holding 8.2
human 8.2
indoor 8.2
architecture 8.2
group 8.1
sexy 8
musical instrument 8
life 7.9
indoors 7.9
player 7.9
day 7.8
brunette 7.8
physical 7.7
train 7.7
crowd 7.7
expression 7.7
two 7.6
healthy 7.6
bag 7.5
club 7.5
one person 7.5
one 7.5
blur 7.4
technology 7.4
speed 7.3
window 7.3
competition 7.3
playing 7.3
confident 7.3
body 7.2
cute 7.2
clothing 7.2
interior 7.1
work 7.1
together 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 98.7
window 97.1
clothing 94.8
footwear 93.8
person 86.6
dress 77.9
black and white 65.2
computer 53.2

Face analysis

AWS Rekognition

Age 30-40
Gender Male, 92.9%
Happy 98.7%
Surprised 0.9%
Calm 0.1%
Sad 0.1%
Angry 0.1%
Fear 0.1%
Confused 0.1%
Disgusted 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.6%

Captions

Microsoft

a person standing in front of a window 75%
a person standing in front of a building 74.9%
a person standing in front of a window 64.2%

Text analysis

Amazon

17150
KODAAA--EITW

Google

MJ7--YT33A°2--XAa
MJ7--YT33A°2--XAa