Human Generated Data

Title

Untitled (family portrait in living room)

Date

1965

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17618

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.7
Human 99.7
Person 99.6
Person 99.5
Person 98.1
Person 97.5
Clothing 84.4
Apparel 84.4
Clinic 83
Furniture 75.2
Room 72
Indoors 72
People 69.4
Chair 62.1
Living Room 55.3

Clarifai
created on 2023-10-29

people 99.9
adult 98.9
man 98.4
group 98.4
woman 97.9
furniture 97.9
sit 97.8
elderly 97.3
chair 96.9
room 96.8
home 96.7
group together 95.9
leader 95.5
medical practitioner 93.9
two 93.4
administration 92.9
seat 92.7
four 92.6
recreation 92.2
several 91.7

Imagga
created on 2022-02-26

home 30.3
room 27.9
people 25.1
indoors 23.7
interior 23
person 22.8
man 22.8
house 21.7
lifestyle 21.7
adult 18.9
male 17.8
happy 16.3
portrait 15.5
window 14.7
newspaper 14.1
sitting 13.7
clothing 13.7
smile 13.5
dishwasher 13.4
modern 13.3
indoor 12.8
equipment 12.6
shop 12.5
family 12.4
looking 12
pretty 11.9
casual 11.9
product 11.7
furniture 11.4
life 11.3
health 11.1
women 11.1
domestic 11
happiness 11
alone 11
smiling 10.8
chair 10.8
white goods 10.3
creation 10.3
mother 10.2
new 9.7
home appliance 9.6
two 9.3
black 9
appliance 9
one 9
couple 8.7
child 8.7
men 8.6
mature 8.4
camera 8.3
fashion 8.3
office 8.2
style 8.2
cheerful 8.1
glass 8.1
light 8
love 7.9
work 7.7
attractive 7.7
comfortable 7.6
senior 7.5
fun 7.5
holding 7.4
inside 7.4
mercantile establishment 7.3
business 7.3
color 7.2
bedroom 7.2

Google
created on 2022-02-26

Chair 87.2
Line 81.8
Art 81.8
Monochrome 73.6
Monochrome photography 73.4
Vintage clothing 72.4
Painting 71.8
Room 71
Visual arts 66.8
Sitting 66.6
Classic 66.2
Pattern 64.1
History 62.9
Rectangle 57.9
Collection 52.4
Retro style 50.8

Microsoft
created on 2022-02-26

clothing 96.5
person 95.2
table 91
furniture 89.6
text 88.3
man 87.3
chair 80.1
drawing 74.6
house 72.3

Face analysis

AWS Rekognition

Age 38-46
Gender Female, 61.7%
Happy 94.9%
Surprised 2.1%
Calm 1.4%
Disgusted 0.6%
Confused 0.3%
Sad 0.2%
Angry 0.2%
Fear 0.2%

AWS Rekognition

Age 28-38
Gender Male, 97.3%
Surprised 49.8%
Happy 34.9%
Fear 3.4%
Disgusted 3%
Angry 2.9%
Confused 2.3%
Calm 1.9%
Sad 1.8%

AWS Rekognition

Age 31-41
Gender Male, 99.8%
Calm 91.3%
Happy 3.8%
Sad 1.8%
Confused 1.5%
Disgusted 0.9%
Surprised 0.4%
Angry 0.2%
Fear 0.2%

AWS Rekognition

Age 31-41
Gender Male, 72.8%
Happy 94.7%
Calm 4.9%
Confused 0.1%
Sad 0.1%
Fear 0.1%
Surprised 0.1%
Disgusted 0%
Angry 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Chair
Person 99.7%
Person 99.6%
Person 99.5%
Person 98.1%
Person 97.5%
Chair 62.1%

Categories

Imagga

interior objects 99.9%

Text analysis

Amazon

2
s
3

Google

MJI--YT37A°2--XA G
MJI--YT37A°2--XA
G