Human Generated Data

Title

Untitled (man and woman sitting with dog)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17013

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Chair 98.7
Furniture 98.7
Person 98.6
Human 98.6
Dog 97.4
Mammal 97.4
Animal 97.4
Canine 97.4
Pet 97.4
Person 93.5
Clothing 87.7
Apparel 87.7
Room 69.4
Indoors 69.4
Text 64.9
Flooring 62.5
Female 60.9
People 59.8
Sitting 59.7
Clinic 59.4
Living Room 58.1
Girl 56.4
Home Decor 55.3
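
The Amazon tag list above has the shape of output from the AWS Rekognition DetectLabels operation, which returns a label name and a confidence score for each detection. The following is a minimal sketch of such a call, not the museum's actual pipeline; the file name and confidence threshold are illustrative assumptions.

import boto3

def detect_labels(image_path, min_confidence=55.0):
    # Call Rekognition's DetectLabels on local image bytes; each returned
    # label carries a Name and a Confidence score, e.g. "Chair 98.7".
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=min_confidence,
        )
    return [(label["Name"], label["Confidence"]) for label in response["Labels"]]

# Hypothetical usage (file name is an assumption):
# for name, conf in detect_labels("untitled_man_woman_dog.jpg"):
#     print(f"{name} {conf:.1f}")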

Clarifai
created on 2023-10-28

people 99.9
group 99.4
adult 98.9
man 98.1
leader 97.4
group together 96.6
administration 95.5
woman 94.2
gown (clothing) 91.4
medical practitioner 91
two 90.8
book series 88.8
three 88.1
furniture 88
several 87.5
education 86.1
sit 85.7
chair 81.6
wear 81.2
actor 80.8

Imagga
created on 2022-02-26

man 33.6
people 29
dalmatian 27.2
person 25.8
wheelchair 23.9
dog 22.5
chair 20.6
male 19.9
adult 19.2
room 18.4
domestic animal 17.3
sitting 17.2
canine 16.9
lifestyle 16.6
women 16.6
happy 16.3
portrait 15.5
indoors 14.9
seat 14.5
home 13.5
senior 13.1
smiling 13
men 12.9
smile 12.8
to 11.5
cheerful 11.4
sport 10.9
family 10.7
human 10.5
outdoors 10.4
boy 10.4
furniture 10.1
indoor 10
fun 9.7
together 9.6
couple 9.6
day 9.4
casual 9.3
mature 9.3
house 9.2
patient 9.1
relaxing 9.1
team 9
musical instrument 8.8
hospital 8.8
businessman 8.8
happiness 8.6
elderly 8.6
relax 8.4
health 8.3
sax 8.3
student 8.2
care 8.2
girls 8.2
lady 8.1
looking 8
automaton 8
interior 8
medical 7.9
disabled 7.9
sick 7.7
attractive 7.7
youth 7.7
outdoor 7.6
husband 7.6
illness 7.6
reading 7.6
leisure 7.5
holding 7.4
teenager 7.3
business 7.3
mother 7.3
office 7.2
computer 7.2
world 7.1
face 7.1

Google
created on 2022-02-26

Photograph 94.2
Black 89.5
Black-and-white 84.5
Style 83.9
Table 81.9
Chair 77.3
Couch 76.4
Monochrome 73.8
Monochrome photography 73.5
Art 70.7
Coffee table 69.2
Vintage clothing 69.1
Sitting 67.7
Suit 63.5
Room 62.7
Living room 62.3
Games 62
Classic 61.6
Recreation 57.5
Curtain 56.4

Microsoft
created on 2022-02-26

table 95.5
furniture 92.5
indoor 90.9
text 84
person 75.2
clothing 64.2
chair 61.8
room 45

Color Analysis

Face analysis

AWS Rekognition

Age 49-57
Gender Female, 54.7%
Calm 87.8%
Disgusted 5.2%
Confused 2.5%
Surprised 2.2%
Angry 0.8%
Happy 0.6%
Sad 0.5%
Fear 0.5%

AWS Rekognition

Age 30-40
Gender Female, 75.1%
Calm 33.1%
Fear 25.1%
Happy 14.7%
Sad 9.4%
Surprised 6.4%
Angry 3.9%
Disgusted 3.8%
Confused 3.6%
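
The age range, gender, and emotion percentages above have the shape of results from Rekognition's DetectFaces operation when all facial attributes are requested. A minimal sketch, assuming boto3 and a local image file (the file name is illustrative):

import boto3

def report_faces(image_path):
    # Request all facial attributes so AgeRange, Gender, and Emotions
    # are included for every detected face.
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],
        )
    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions come back with confidence scores; print highest first.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")

# Hypothetical usage:
# report_faces("untitled_man_woman_dog.jpg")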

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
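
The Google Vision rows report likelihoods (Very unlikely through Very likely) rather than percentages, matching the face_detection method of the Cloud Vision API. A minimal sketch, assuming the google-cloud-vision client library and an illustrative file name:

from google.cloud import vision

def report_faces_google_vision(image_path):
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Likelihood fields are enums such as VERY_UNLIKELY or POSSIBLE.
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)

# Hypothetical usage:
# report_faces_google_vision("untitled_man_woman_dog.jpg")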

Feature analysis

Amazon

Chair
Person
Dog
Chair 98.7%
Person 98.6%
Person 93.5%
Dog 97.4%

Categories

Imagga

interior objects 99.3%

Text analysis

Amazon

5
KODA-A-IW
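
The detected strings ("5", "KODA-A-IW") are OCR readings of printing on the photograph itself and have the shape of output from Rekognition's DetectText operation. A minimal sketch, assuming boto3 and an illustrative file name:

import boto3

def read_text(image_path):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})
    # Keep line-level detections only; WORD entries repeat the same text.
    return [d["DetectedText"] for d in response["TextDetections"]
            if d["Type"] == "LINE"]

# Hypothetical usage:
# print(read_text("untitled_man_woman_dog.jpg"))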

Google

MJI7- - YT3RA°2- - XAGOX
MJI7-
-
YT3RA°2-
XAGOX