Human Generated Data

Title

Social Settlements: United States. Ohio. Cleveland. Hiram House: Hiram House. Cleveland, Ohio: The Model Cottage: At Table.

Date

c. 1903

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Social Museum Collection, 3.2002.185.2

Machine Generated Data

Tags

Amazon
created on 2019-06-05

Person 98.2
Human 98.2
Person 97.1
Person 92.6
Art 92.3
Painting 92.3
Furniture 91.6
Apparel 80.6
Clothing 80.6
Person 71.3
People 66.7
Indoors 59.9
Room 59.9
Person 59.8
Bed 59
Mosquito Net 56.1
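
The labels above are typical output of Amazon Rekognition's label detection. A minimal sketch of how such tags can be produced follows; it assumes AWS credentials are configured and that the scanned photograph is available locally under the hypothetical file name "hiram_house_model_cottage.jpg", and the MaxLabels/MinConfidence values are illustrative rather than the settings actually used.

```python
# Sketch: Rekognition-style labels for a scanned photograph (assumed local file).
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("hiram_house_model_cottage.jpg", "rb") as image_file:
    image_bytes = image_file.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,        # roughly the number of tags listed above
    MinConfidence=55.0,  # illustrative cutoff; the lowest tag above is ~56
)

# Print each label with its confidence score, mirroring the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```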

Clarifai
created on 2019-06-05

people 99.9
group 99.7
adult 99.7
furniture 98.4
wear 97.9
man 97.7
many 96.8
woman 95.6
room 94.9
veil 94.2
several 94.1
group together 93
mammal 92.4
seat 92.2
container 91.9
leader 90
administration 89.7
military 88.4
art 87.7
child 87.5

Imagga
created on 2019-06-05

room 21.6
furniture 21.5
interior 21.2
home 20.7
couch 20.3
sofa 18.5
house 18.4
chair 14.4
decor 13.3
pillow 13
decoration 12.6
celebration 12
luxury 11.1
old 11.1
adult 11.1
indoor 11
seat 10.9
cushion 10.9
table 10.9
indoors 10.5
modern 10.5
person 10.3
floor 10.2
wedding 10.1
bedroom 10
clothing 9.6
people 9.5
love 9.5
day 9.4
elegance 9.2
flower 9.2
life 9.1
food 8.9
happy 8.8
man 8.7
holiday 8.6
marriage 8.5
design 8.5
relaxation 8.4
antique 8.3
inside 8.3
dress 8.1
family 8
rest 7.9
bouquet 7.8
sculpture 7.8
sitting 7.7
attractive 7.7
bed 7.7
bride 7.7
apartment 7.7
case 7.6
covering 7.6
living 7.6
fashion 7.5
wood 7.5
style 7.4
20s 7.3
lifestyle 7.2
bakery 7.2
cadaver 7.2
mother 7.1
architecture 7

Google
created on 2019-06-05

Microsoft
created on 2019-06-05

clothing 95.7
person 94.4
human face 67.4
painting 62
old 57.8

Face analysis

AWS Rekognition

Age 17-27
Gender Female, 54.2%
Happy 45.1%
Sad 51.8%
Angry 45.5%
Surprised 45.2%
Calm 46.8%
Confused 45.3%
Disgusted 45.3%

AWS Rekognition

Age 20-38
Gender Female, 54.4%
Disgusted 45.1%
Sad 53.4%
Calm 45.7%
Confused 45.1%
Happy 45.5%
Surprised 45.1%
Angry 45.1%

AWS Rekognition

Age 26-43
Gender Female, 54%
Calm 45.1%
Surprised 45%
Confused 45%
Sad 54.7%
Angry 45.1%
Disgusted 45%
Happy 45%

AWS Rekognition

Age 26-43
Gender Female, 54.1%
Disgusted 45.1%
Confused 45.2%
Happy 46.8%
Calm 45.2%
Surprised 45.1%
Sad 52.5%
Angry 45.2%

AWS Rekognition

Age 26-43
Gender Female, 52.4%
Calm 45.9%
Disgusted 45.4%
Happy 47.9%
Surprised 45.3%
Sad 49.5%
Angry 45.8%
Confused 45.1%

AWS Rekognition

Age 45-66
Gender Male, 50.7%
Angry 45.2%
Disgusted 45.1%
Calm 53.2%
Surprised 45.2%
Happy 45.1%
Sad 46.1%
Confused 45.1%
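
The per-face age range, gender, and emotion entries above are the shape of Amazon Rekognition's face detection output. A minimal sketch follows, using the same hypothetical local scan as before; requesting all attributes is needed to get age, gender, and emotion estimates.

```python
# Sketch: Rekognition face attributes (age range, gender, emotions) for one image.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("hiram_house_model_cottage.jpg", "rb") as image_file:
    image_bytes = image_file.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # include age range, gender, and emotions
)

# One block per detected face, in the style of the entries above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```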

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Unlikely
Blurred Very unlikely
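
The likelihood ratings above (Joy, Sorrow, Anger, Surprise, Headwear, Blurred) match the fields returned by Google Cloud Vision face detection. A minimal sketch follows, assuming the google-cloud-vision client library is installed, application credentials are set, and the same hypothetical local scan is used.

```python
# Sketch: Google Cloud Vision face-detection likelihoods for one image.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("hiram_house_model_cottage.jpg", "rb") as image_file:
    content = image_file.read()

response = client.face_detection(image=vision.Image(content=content))

# Likelihood values are an enum from VERY_UNLIKELY to VERY_LIKELY.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```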

Feature analysis

Amazon

Person 98.2%
Painting 92.3%

Categories

Imagga

interior objects 97.5%
paintings art 1.5%

Text analysis

Amazon

Toble
ottage.
At Toble
At
S 25
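
The fragments above ("Toble", "ottage.", "At Toble") are raw text detections from the handwritten caption on the mount. A minimal sketch of Amazon Rekognition text detection follows, again assuming the hypothetical local scan used in the earlier sketches.

```python
# Sketch: Rekognition text detection for the handwritten caption on the mount.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("hiram_house_model_cottage.jpg", "rb") as image_file:
    image_bytes = image_file.read()

response = client.detect_text(Image={"Bytes": image_bytes})

# Each detection is either a LINE or a WORD; print both with confidence.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"],
          round(detection["Confidence"], 1))
```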

Google

S 2 5 ottage Af Table.
S
2
5
ottage
Af
Table.