Human Generated Data

Title

Education, General: Germany. Berlin. Volkskindergarten: Froebelhaus

Date

1897

People

Artist: Waldemar Franz Herman Titzenthaler, German, 1869-1937

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Social Museum Collection, 3.2002.1452

Machine Generated Data

Tags

Amazon
created on 2019-06-04

Human 89.4
Person 86.3
Apparel 83
Clothing 83
Face 76.5
People 74.1
Advertisement 71.1
Person 69.4
Poster 67.6
Indoors 67.5
Room 66.9
Leisure Activities 65
Building 63.1
Collage 62.3
Suit 60
Coat 60
Overcoat 60
Female 55.2

Clarifai
created on 2019-06-04

people 99.6
adult 98.8
furniture 98.5
group 98.3
room 97.1
art 96.5
indoors 96.3
woman 96.3
man 96.1
painting 95.8
one 95.6
illustration 95.4
museum 94.6
portrait 94.5
exhibition 93.4
wear 93.3
print 93.1
two 91.7
picture frame 91
family 88.4

Imagga
created on 2019-06-04

case 53.5
container 23.4
paper 21.2
blank 17.1
envelope 16.9
home 16.8
old 16.7
holding 14.9
wall 13.7
box 13.1
business 12.8
house 12.5
empty 12
furniture 11.9
package 11.7
frame 11.7
interior 11.5
bill 11.4
sketch 11.3
card 11.1
cash 11
drawing 10.9
bank 10.7
stamp 10.4
money 10.2
finance 10.1
message 10
modern 9.8
design 9.8
board 9.8
office 9.7
refrigerator 9.6
brown 9.6
smiling 9.4
happy 9.4
grunge 9.4
light 9.4
home appliance 9.2
note 9.2
vintage 9.1
adult 9.1
room 9
currency 9
indoors 8.8
carton 8.8
day 8.6
retro 8.2
new 8.1
metal 8
space 7.8
white goods 7.7
cardboard 7.7
sign 7.5
appliance 7.5
page 7.4
banking 7.4
aged 7.2
holiday 7.2
financial 7.1
person 7.1
wooden 7

Google
created on 2019-06-04

Microsoft
created on 2019-06-04

scene 100
gallery 99.9
room 99.9
wall 99.4
person 96.7
clothing 89
picture frame 74.6
black and white 71.6
man 71.1
white 61.3
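
The tag lists above are label-detection outputs: each line is a label name followed by the service's confidence score, expressed as a percentage. As a minimal sketch of how such labels might be retrieved from one of these services, the following assumes the AWS Rekognition DetectLabels API via boto3 with a local copy of the photograph; the filename, region, and thresholds are placeholders, not the pipeline actually used to build this record.

    import boto3

    # Rekognition client; the region is a placeholder assumption.
    client = boto3.client("rekognition", region_name="us-east-1")

    # Read the image bytes from a local file (hypothetical filename).
    with open("social_museum_1452.jpg", "rb") as f:
        image_bytes = f.read()

    # DetectLabels returns label names with confidence percentages,
    # comparable to the "Human 89.4", "Person 86.3", ... list above.
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=25,
        MinConfidence=55.0,
    )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')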

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 35-53
Gender Female, 50.1%
Calm 49.6%
Sad 50.2%
Angry 49.5%
Disgusted 49.5%
Confused 49.5%
Happy 49.6%
Surprised 49.5%

AWS Rekognition

Age 20-38
Gender Female, 54.2%
Sad 51.4%
Disgusted 45.7%
Calm 45.3%
Surprised 45.4%
Happy 46.3%
Angry 45.6%
Confused 45.3%
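
Each AWS Rekognition block above corresponds to one detected face: an estimated age range, a gender estimate with its confidence, and a confidence score per emotion. A minimal sketch of reading those fields, assuming the boto3 DetectFaces call with Attributes=["ALL"] and a placeholder filename:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    with open("social_museum_1452.jpg", "rb") as f:
        image_bytes = f.read()

    # Attributes=["ALL"] requests age range, gender, and emotions
    # in addition to the default bounding box and landmarks.
    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]        # e.g. {"Low": 35, "High": 53}
        gender = face["Gender"]       # e.g. {"Value": "Female", "Confidence": 50.1}
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:   # one confidence score per emotion type
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')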

Feature analysis

Amazon

Person 86.3%

Categories

Captions

Microsoft
created on 2019-06-04

a person in a white room 68.8%
a person sitting in a room 64.9%
an old photo of a person 64.8%

Text analysis

Amazon

AN
AN EMERArNCY
EMERArNCY

Google

TEA AN EMERGrNCY
TEA
AN
EMERGrNCY
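
The entries under Text analysis are OCR detections reproduced verbatim; the lettering in the photograph is only partly legible, hence strings such as "EMERArNCY". A minimal sketch of how the Amazon detections might be listed, assuming the boto3 DetectText call and a placeholder filename:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    with open("social_museum_1452.jpg", "rb") as f:
        image_bytes = f.read()

    # DetectText returns full LINE detections as well as the individual
    # WORD detections they are built from, which is why the list above
    # contains "AN EMERArNCY" alongside "AN" and "EMERArNCY" separately.
    response = client.detect_text(Image={"Bytes": image_bytes})

    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"],
              f'{detection["Confidence"]:.1f}%')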