Human Generated Data

Title

Untitled (woman and baby looking in large mirror while sitting on blanket on floor)

Date

1947

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9192

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 98.7
Human 98.7
Person 97.2
Person 96.3
Baby 91.2
Clothing 87.4
Apparel 87.4
Person 82.5
Female 72.9
Kid 69.9
Child 69.9
Furniture 69.7
Face 68.5
Play 67.4
People 66.2
Home Decor 65
Girl 63.5
Poster 62.6
Advertisement 62.6
Photo 61.4
Photography 61.4
Text 60.8
Silhouette 57.7
Outdoors 55.8
Floor 55.8
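
A label-and-confidence listing like the Amazon tags above is the kind of output AWS Rekognition's DetectLabels operation returns. A minimal sketch of such a call, assuming boto3, configured AWS credentials, and a hypothetical local filename for the photograph (not the museum's actual pipeline):

```python
# Minimal sketch: label detection with AWS Rekognition via boto3.
# The filename and region are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("4_2002_9192.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # the lowest confidence listed above is 55.8
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```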

Clarifai
created on 2023-10-27

people 99.9
child 99.5
woman 97
group 96.5
adult 96.2
actress 95.8
two 95.7
dancing 95.7
dancer 95.4
music 94
monochrome 93.9
man 93.1
offspring 92.9
girl 91.9
portrait 91.6
boy 91.6
recreation 91.3
group together 91.2
baby 91.1
movie 90.9

Imagga
created on 2022-01-23

person 41.3
people 27.9
adult 27.5
laptop 25.1
sitting 21.5
computer 21
lifestyle 20.2
man 20.2
smiling 19.5
happy 19.4
male 18.4
business 18.2
attractive 18.2
hair 17.4
portrait 16.8
smile 16.4
television 16.1
patient 15.7
one 15.7
model 15.6
casual 15.2
fashion 15.1
fun 15
clothing 14.8
cheerful 14.6
women 14.2
indoors 14.1
pretty 14
office 13.8
professional 13.7
businesswoman 13.6
brunette 13.1
case 13
work 12.6
equipment 12.4
world 12.2
sick person 12.1
men 12
style 11.9
working 11.5
sexy 11.2
outdoors 11.2
home 11.2
job 10.6
jeans 10.5
human 10.5
active 10.5
telecommunication system 10.4
body 10.4
technology 10.4
corporate 10.3
table 10
face 9.9
studio 9.9
modern 9.8
lady 9.7
blond 9.6
looking 9.6
happiness 9.4
communication 9.2
exercise 9.1
black 9
sport 9
success 8.8
room 8.7
using 8.7
cute 8.6
floor 8.4
joy 8.4
health 8.3
teenager 8.2
music 8.1
suit 8.1
worker 8
businessman 7.9
mother 7.9
urban 7.9
notebook 7.8
elegant 7.7
youth 7.7
senior 7.5
leisure 7.5
guitar 7.5
performer 7.4
device 7.4
activity 7.2
handsome 7.1
interior 7.1
day 7.1
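
The Imagga tags follow the same tag-plus-confidence pattern. One way such tags can be produced is sketched below, assuming Imagga's v2 /tags REST endpoint, HTTP Basic authentication with an API key/secret pair, and a placeholder image URL:

```python
# Minimal sketch: image tagging via the Imagga REST API (assumed v2 /tags endpoint).
# Credentials and the image URL are placeholders.
import requests

IMAGGA_KEY = "your_api_key"
IMAGGA_SECRET = "your_api_secret"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/4_2002_9192.jpg"},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
response.raise_for_status()

for item in response.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```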

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

person 95.2
text 94.9
black and white 90
clothing 83.1
toddler 74.3
baby 61.7

Color Analysis

Face analysis

AWS Rekognition

Age 37-45
Gender Female, 79%
Happy 92.3%
Calm 1.9%
Fear 1.6%
Surprised 1.5%
Sad 1.2%
Disgusted 0.6%
Angry 0.5%
Confused 0.4%

AWS Rekognition

Age 9-17
Gender Male, 100%
Calm 98.9%
Surprised 0.9%
Confused 0%
Disgusted 0%
Sad 0%
Happy 0%
Fear 0%
Angry 0%
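
Each AWS Rekognition face record above (age range, gender with a confidence, and per-emotion percentages) corresponds to one entry in the FaceDetails list returned by the DetectFaces operation when all attributes are requested. A sketch, again assuming boto3 and a hypothetical local filename:

```python
# Minimal sketch: face attributes with AWS Rekognition DetectFaces.
# Attributes=["ALL"] requests age range, gender, and emotion scores.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("4_2002_9192.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.0f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```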

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
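
The Google Vision rows report likelihood buckets (Very unlikely through Very likely) rather than percentages, as its face detection feature does. A sketch, assuming the google-cloud-vision Python client with application default credentials and a placeholder filename:

```python
# Minimal sketch: face likelihoods with the Google Cloud Vision client.
# Credentials and the filename are placeholders.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("4_2002_9192.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihood enum names such as VERY_UNLIKELY mirror the buckets above.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```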

Feature analysis

Amazon

Person 98.7%

Captions