Human Generated Data

Title

Untitled (portrait of man and three girls)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17019

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99
Human 99
Person 98.7
Person 98.1
Person 95.7
Clothing 89.3
Apparel 89.3
Furniture 73.4
Table Lamp 73.2
Lamp 73.2
Tie 68.3
Accessories 68.3
Accessory 68.3
Face 67.1
Hair 63.2
Female 63
Portrait 62.3
Photography 62.3
Photo 62.3
Couch 57.8
Blonde 56.7
Teen 56.7
Woman 56.7
Kid 56.7
Girl 56.7
Child 56.7
Tie 55.6
Suit 55.5
Coat 55.5
Overcoat 55.5
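
The Amazon tags above are label/confidence pairs of the kind returned by the AWS Rekognition DetectLabels API. The exact request used to produce this record is not documented here; the following is only a minimal sketch with boto3, in which the file name, region, and thresholds are placeholders rather than values from the source.

```python
import boto3

# Placeholder region; the actual job configuration is not documented in this record.
rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local copy of the photograph, not a path from the source record.
with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,       # assumed cap on returned labels
    MinConfidence=55,   # assumed confidence cutoff
)

# Print label/confidence pairs in the same style as the listing above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```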

Clarifai
created on 2023-10-29

people 99.7
monochrome 99.1
man 97.3
child 96.9
woman 96.6
family 94.7
adult 94.5
science 94.3
hospital 93.7
education 91.4
group 91.3
indoors 91.3
two 88.7
medical practitioner 85.8
portrait 85.1
healthcare 84.1
baby 83.5
three 83
interaction 82.6
couple 81.3

Imagga
created on 2022-02-26

person 43.3
man 37
senior 35.6
people 32.9
adult 32.5
male 29.1
lab coat 29
elderly 28.7
happy 25.7
portrait 24.6
groom 24.6
couple 24.4
mature 24.2
coat 23.5
old 20.9
businessman 20.3
professional 19.2
looking 18.4
smile 17.8
patient 17.8
medical 17.7
sitting 17.2
doctor 16.9
smiling 16.6
student 16.6
retirement 16.3
specialist 16
room 15.6
education 15.6
husband 15.4
women 15
nurse 15
indoors 14.9
lady 14.6
health 14.6
retired 14.5
worker 14.4
home 14.4
human 14.2
wife 14.2
work 14.1
holding 14
together 14
business 14
face 13.5
teacher 13.2
lifestyle 13
men 12.9
occupation 12.8
laptop 12.8
technology 12.6
office 12
hospital 12
garment 12
clothing 11.9
indoor 11.9
love 11.8
aged 11.8
suit 11.7
hair 11.1
happiness 11
gray 10.8
team 10.8
older 10.7
family 10.7
working 10.6
grandfather 10.5
tie 10.4
computer 10.4
case 10.2
two 10.2
executive 10.1
care 9.9
hand 9.9
handsome 9.8
pretty 9.8
to 9.7
medicine 9.7
group 9.7
married 9.6
table 9.5
pensioner 9.4
expression 9.4
equipment 9.4
glasses 9.3
life 9
explaining 8.9
job 8.8
60s 8.8
clinic 8.6
serious 8.6
age 8.6
grandma 8.5
casual 8.5
manager 8.4
teamwork 8.3
camera 8.3
sick person 8.3
one 8.2
look 7.9
grandmother 7.8
corporate 7.7
concentration 7.7
modern 7.7
attractive 7.7
illness 7.6
uniform 7.6
desk 7.6
necktie 7.6
stethoscope 7.6
positive 7.4
successful 7.3
cheerful 7.3
alone 7.3
entrepreneur 7.1
day 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 98.7
wall 96.9
person 91.3
human face 88.1
clothing 85.7
black and white 73.7
man 64.1
old 53.7
posing 41.2

Color Analysis

Face analysis

AWS Rekognition

Age 45-53
Gender Male, 100%
Calm 94.9%
Confused 2.3%
Happy 2.3%
Disgusted 0.2%
Surprised 0.1%
Sad 0.1%
Angry 0.1%
Fear 0%

AWS Rekognition

Age 35-43
Gender Female, 59.9%
Calm 49.3%
Sad 32.7%
Happy 12.3%
Surprised 2.4%
Confused 1%
Disgusted 0.9%
Angry 0.8%
Fear 0.6%
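
The two AWS Rekognition face records above (age range, gender, and per-emotion scores) match the structure returned by the DetectFaces API when all attributes are requested. A minimal sketch with boto3, assuming a hypothetical local copy of the image:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Placeholder file name; not a path from the source record.
with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # required to get age range, gender, and emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.0f}%')
    # Emotions are reported as a list of type/confidence pairs, as in the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```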

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely
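
Google Vision reports face attributes as likelihood categories (Very unlikely through Very likely) rather than percentages. A minimal sketch with the google-cloud-vision client, again using a placeholder file name rather than anything from the source record:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Hypothetical local copy of the photograph.
with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihoods are enum values (VERY_UNLIKELY ... VERY_LIKELY), not scores.
    print("Surprise:", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger:", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow:", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy:", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear:", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred:", vision.Likelihood(face.blurred_likelihood).name)
```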

Feature analysis

Amazon

Person 99%
Person 98.7%
Person 98.1%
Person 95.7%
Tie 68.3%
Tie 55.6%

Categories

Imagga

people portraits 65.8%
paintings art 24.5%
pets animals 7.2%