Human Generated Data

Title

Untitled (two children with book)

Date

c. 1940

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1938

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.3
Human 99.3
Person 99.2
Reading 93.8
Boy 83.4
Baby 73.2
Kid 70
Child 70
Furniture 55.6
Girl 55.2
Female 55.2
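
For context, label/confidence pairs like the ones above can be produced with Amazon Rekognition's DetectLabels operation. Below is a minimal sketch using boto3; the region, local file path, and thresholds are illustrative assumptions, not values taken from this record.

import boto3

# Hedged sketch: assumes AWS credentials are configured and the photograph
# is available locally as "photo.jpg" (an illustrative path, not from the record).
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,        # cap on the number of labels returned
    MinConfidence=50.0,  # drop low-confidence labels, similar to the list above
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')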

Clarifai
created on 2023-10-25

people 99.9
child 99.8
one 97.3
son 95.9
boy 94.9
two 94.4
room 93
wear 91.9
monochrome 90.2
baby 90.1
adult 89.9
indoors 88.4
portrait 88.4
family 87.1
offspring 86.4
man 86.3
administration 86
furniture 84.5
facial expression 83.6
sit 82.5
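
Concept/confidence pairs like these come from a Clarifai predict request. The sketch below uses Clarifai's v2 REST predict endpoint; the personal access token, model ID, owner IDs, and image URL are placeholders and assumptions, not values from this record.

import requests

# Hedged sketch of a Clarifai v2 predict call; all credentials and IDs below
# are placeholders/assumptions.
PAT = "YOUR_CLARIFAI_PAT"
MODEL_ID = "general-image-recognition"  # assumed ID of Clarifai's public general model
IMAGE_URL = "https://example.com/photo.jpg"

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {PAT}"},
    json={
        "user_app_id": {"user_id": "clarifai", "app_id": "main"},  # assumed owner of the public model
        "inputs": [{"data": {"image": {"url": IMAGE_URL}}}],
    },
    timeout=30,
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')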

Imagga
created on 2021-12-14

person 32.9
laptop 30.3
lifestyle 29.6
people 29.6
adult 29
computer 27.3
sitting 25.8
indoors 25.5
man 24.9
attractive 24.5
happy 23.8
smiling 23.1
male 22.9
portrait 22.6
casual 22
home 20.7
pretty 20.3
working 18.6
looking 18.4
indoor 16.4
face 16.3
smile 15.7
office 15.5
business 15.2
relaxation 15.1
senior 15
one 14.9
happiness 14.9
technology 14.8
child 14.7
holding 14
mature 13.9
couple 13.9
cheerful 13.8
hair 13.5
women 13.4
bath 13.3
men 12.9
bathroom 12.8
bathtub 12.8
care 12.3
groom 12.3
vessel 12.1
room 12
modern 11.9
relaxing 11.8
work 11.8
communication 11.8
desk 11.4
health 11.1
tub 11.1
20s 11
alone 11
table 10.9
patient 10.9
businesswoman 10.9
cute 10.8
handsome 10.7
blond 10.6
businessman 10.6
human 10.5
sexy 10.4
businesspeople 10.4
clothing 10.3
skin 10.2
relax 10.1
clean 10
confident 10
job 9.7
executive 9.6
body 9.6
elderly 9.6
professional 9.6
healthy 9.4
one person 9.4
window 9.4
horizontal 9.2
worker 9.1
wet 8.9
lady 8.9
life 8.9
color 8.9
love 8.7
water 8.7
sibling 8.6
sofa 8.6
corporate 8.6
notebook 8.6
house 8.4
leisure 8.3
girls 8.2
spa 8.1
25-30 years 7.8
luxury 7.7
only 7.6
wireless 7.6
windowsill 7.6
sit 7.6
meeting 7.5
joy 7.5
case 7.4
coffee 7.4
occupation 7.3
student 7.2
dress 7.2
team 7.2
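
Tags of this kind can be requested from Imagga's tagging endpoint. A minimal sketch follows; the API key/secret and image URL are placeholders, and the response shape is assumed from Imagga's public v2 documentation.

import requests
from requests.auth import HTTPBasicAuth

# Hedged sketch of Imagga's /v2/tags endpoint; credentials and URL are placeholders.
API_KEY = "YOUR_IMAGGA_KEY"
API_SECRET = "YOUR_IMAGGA_SECRET"
IMAGE_URL = "https://example.com/photo.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=HTTPBasicAuth(API_KEY, API_SECRET),
    timeout=30,
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')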

Google
created on 2021-12-14

Shirt 94.1
Curious 84.1
Black-and-white 83.8
Toddler 79.7
Window 79.6
Monochrome 73.1
Monochrome photography 72.9
Child 68.8
Room 67
Sitting 64.3
Stock photography 63
T-shirt 60.8
Glass 57
Curtain 56.8
Building 56.7
Fun 55.5
Rectangle 52.5
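
Labels like these correspond to Google Cloud Vision label detection. A minimal sketch using the google-cloud-vision client is shown below; it assumes application credentials are configured and uses an illustrative local file path.

from google.cloud import vision

# Hedged sketch: assumes GOOGLE_APPLICATION_CREDENTIALS is set and the photograph
# is available locally as "photo.jpg" (an illustrative path).
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

for label in response.label_annotations:
    # score is on a 0-1 scale; the record above shows it scaled to a percentage
    print(f"{label.description} {label.score * 100:.1f}")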

Microsoft
created on 2021-12-14

person 98.7
toddler 94.5
clothing 94.2
boy 93.7
black and white 90.5
baby 89.3
human face 89
child 58.3
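
Tags like these can be obtained from the Azure Computer Vision "Tag Image" REST operation. The sketch below is an assumption-laden example: the endpoint, subscription key, API version, and image URL are placeholders, not values from this record.

import requests

# Hedged sketch of the Azure Computer Vision Tag Image call; all values below
# are placeholders/assumptions.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
KEY = "YOUR_AZURE_KEY"
IMAGE_URL = "https://example.com/photo.jpg"

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
    timeout=30,
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')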

Color Analysis

Face analysis

AWS Rekognition

Age 13-23
Gender Male, 77.3%
Angry 79.3%
Calm 11%
Disgusted 3.4%
Sad 1.7%
Happy 1.6%
Confused 1.4%
Surprised 1.1%
Fear 0.4%

AWS Rekognition

Age 23-35
Gender Male, 57.8%
Calm 73.1%
Sad 14.7%
Happy 5.4%
Fear 3.3%
Angry 1.5%
Surprised 0.7%
Disgusted 0.7%
Confused 0.6%
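
Age range, gender, and emotion scores like those above are returned by Amazon Rekognition's DetectFaces operation when all facial attributes are requested. A minimal boto3 sketch follows; the file path is an illustrative assumption.

import boto3

# Hedged sketch of Rekognition DetectFaces with full attributes; "photo.jpg"
# is an illustrative path, not a file from this record.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')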

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
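
The likelihood buckets above (Very unlikely, Unlikely, Possible, Likely, Very likely) come from Google Cloud Vision face detection. A minimal sketch with the google-cloud-vision client (2.x+) is shown below; credentials and the file path are assumed.

from google.cloud import vision

# Hedged sketch of Google Vision face detection; assumes configured credentials
# and an illustrative local file "photo.jpg".
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)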

Feature analysis

Amazon

Person 99.3%

Categories

Imagga

paintings art 75.4%
pets animals 21.5%
interior objects 2.7%