Human Generated Data

Title

Untitled (woman and baby looking into mirror)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17923

Machine Generated Data

Tags

Each service pairs a predicted label with a confidence score on a 0-100 scale.

Amazon
created on 2022-03-04

Furniture 98.8
Person 98.8
Human 98.8
Person 98
Person 97.7
Cabinet 88.9
Interior Design 82.3
Indoors 82.3
Dresser 75.2
Drawer 75.1
Room 74.8
Baby 72.6
Monitor 69.3
Electronics 69.3
Display 69.3
Screen 69.3
Living Room 63.6
Photography 62.1
Photo 62.1
Chair 61.8
Portrait 60.3
Face 60.3
Head 60.2
Toy 58.1
Kid 56.7
Child 56.7
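
The label/confidence pairs above match the output format of AWS Rekognition's DetectLabels operation, which the record's face-analysis section also credits. A minimal sketch of how such tags could be regenerated, assuming boto3 is configured with AWS credentials and the photograph is available as a local JPEG (the file name and the MinConfidence threshold are illustrative, not part of the record):

```python
import boto3

# Assumes AWS credentials are configured (environment, profile, or IAM role).
rekognition = boto3.client("rekognition")

# Illustrative file name; the museum image itself is not bundled with this record.
with open("4.2002.17923.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=50,  # threshold chosen for illustration only
)

# Each label carries a name and a 0-100 confidence score,
# which is the format of the "Amazon" tag list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```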

Clarifai
created on 2023-10-29

people 99.9
child 99.7
two 98.6
group 97.8
son 97.1
family 96.5
three 96.1
woman 95.1
boy 93.9
man 93.9
adult 93.3
monochrome 92.3
sibling 91.5
nostalgia 90.7
offspring 90.5
baby 90
four 88.6
mirror 87.5
wear 87.1
indoors 85.7

Imagga
created on 2022-03-04

man 31.6
senior 30
people 29
person 28.1
computer 28.1
blackboard 26.6
laptop 24.6
male 23.4
monitor 21.5
adult 21.4
home 19.9
work 19.6
television 19.1
smiling 18.8
happy 18.8
elderly 18.2
old 18.1
technology 17.8
office 17.7
looking 17.6
sitting 17.2
portrait 16.8
retirement 16.3
business 15.8
case 15.7
couple 15.7
education 15.6
retired 15.5
furniture 15.2
indoors 14.9
mature 14.9
desk 13.8
lady 13.8
indoor 13.7
student 13.6
smile 13.5
equipment 13.4
room 13.3
working 12.4
wife 12.3
electronic equipment 11.7
school 11.7
husband 11.4
classroom 10.9
businessman 10.6
file 9.9
team 9.8
modern 9.8
professional 9.8
worker 9.8
job 9.7
notebook 9.5
table 9.4
businesswoman 9.1
one 9
support 8.9
lifestyle 8.7
learn 8.5
casual 8.5
attractive 8.4
teacher 8.3
holding 8.2
occupation 8.2
office furniture 8.1
hair 7.9
together 7.9
bright 7.9
telecommunication system 7.8
face 7.8
using 7.7
studying 7.7
two 7.6
living 7.6
career 7.6
study 7.5
success 7.2
broadcasting 7.1
family 7.1
interior 7.1
happiness 7
executive 7
specialist 7
look 7

Google
created on 2022-03-04

Microsoft
created on 2022-03-04

indoor 96.8
sink 95.8
wall 95.6
text 92.9
person 89.5
bathroom 82.8
old 82.3
black and white 78.3
drawer 68.8
chest of drawers 58
desk 6.4
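
The Microsoft tags follow the name/confidence structure returned by the Azure Computer Vision Analyze Image API (the API reports confidences on a 0-1 scale, which appear here multiplied by 100). A hedged sketch against the v3.2 REST endpoint; the endpoint, key, file name, and API version are placeholders and assumptions, not details taken from the record:

```python
import requests

# Placeholder resource endpoint and subscription key.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
KEY = "<subscription-key>"

with open("4.2002.17923.jpg", "rb") as f:  # illustrative file name
    image_bytes = f.read()

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
response.raise_for_status()

# Confidences come back in the 0-1 range; scale by 100 to match the list above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```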

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 23-33
Gender Male, 63.8%
Calm 75.3%
Sad 11.7%
Fear 4.9%
Happy 4.4%
Angry 1.3%
Disgusted 0.9%
Surprised 0.9%
Confused 0.6%
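
The age range, gender, and emotion percentages above follow the shape of AWS Rekognition's DetectFaces response when all facial attributes are requested. A minimal sketch under the same assumptions as the label example (illustrative file name, boto3 already configured):

```python
import boto3

rekognition = boto3.client("rekognition")

with open("4.2002.17923.jpg", "rb") as f:  # illustrative file name
    image_bytes = f.read()

# Attributes=["ALL"] asks Rekognition to return age range, gender,
# and emotion estimates for each detected face.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Emotions are reported with uppercase type names and 0-100 confidences.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```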

Feature analysis

Amazon

Person
Person 98.8%
Person 98%
Person 97.7%

Categories

Text analysis

Amazon

17
1
VAGOY
VTR
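
Short strings like those above are the kind of fragments AWS Rekognition's DetectText operation reports for incidental markings in a photograph. A minimal sketch under the same assumptions as the examples above:

```python
import boto3

rekognition = boto3.client("rekognition")

with open("4.2002.17923.jpg", "rb") as f:  # illustrative file name
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# LINE detections group words; WORD detections list individual tokens.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")
```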

Google

17
17