Human Generated Data

Title

Charity, Organizations: United States. New York. Brooklyn. Bureau of Charities: Bureau of Charities, Brooklyn, N.Y.: 69 Schermerhorn Street: Central Day Nursery.

Date

c. 1900

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Social Museum Collection, 3.2002.862.1

Machine Generated Data

Tags

Amazon
created on 2019-06-07

Chair 98.5
Furniture 98.5
Person 97.9
Human 97.9
Indoors 97.4
Room 97.4
Person 97.1
Person 96.9
Person 96.1
Person 94.6
Person 93.9
Person 92.6
Person 92.1
Person 92
Classroom 91.2
School 91.2
Person 91.2
Person 89.6
Restaurant 87.4
Person 84.2
Person 82.4
Person 80.7
Chair 71.8
Workshop 70.7
Cafeteria 68.9
People 67.9
Kindergarten 66.9
Dining Room 65.3
Meal 62.5
Food 62.5
Housing 60.9
Building 60.9
Bedroom 58.1
Dining Table 56.4
Table 56.4
Food Court 55.2
Cafe 55.1
Person 48.6

Clarifai
created on 2019-06-07

people 99.9
group 99.2
group together 99
many 98.4
furniture 97.5
adult 97.2
several 96.5
administration 94.8
woman 94.7
leader 92.8
man 92.4
five 92.1
four 90.7
room 90.1
child 89.6
sit 87.9
recreation 85.7
wear 83
three 82.8
education 81.8

Imagga
created on 2019-06-07

teacher 50.8
room 44.4
classroom 38.4
educator 37.7
professional 26.7
person 26.7
people 26.2
adult 25.9
table 23.5
home 22.3
chair 21.3
couple 20.9
male 20.6
man 20.1
restaurant 19.7
interior 19.4
indoors 19.3
women 19
sitting 17.2
men 16.3
family 16
together 15.8
cheerful 15.4
happiness 14.9
happy 13.8
smiling 13.7
house 13.4
dress 12.6
love 12.6
building 12.5
student 12.5
drinking 12.4
lifestyle 12.3
indoor 11.9
two 11.8
drink 11.7
new 11.3
day 11
romantic 10.7
smile 10.7
celebration 10.4
mature 10.2
glass 10.1
holiday 10
dinner 9.8
old 9.7
kin 9.7
group 9.7
color 9.4
furniture 9.4
cafeteria 9.3
relaxation 9.2
inside 9.2
worker 9.1
modern 9.1
holding 9.1
portrait 9.1
casual clothing 8.8
musical instrument 8.7
boy 8.7
decoration 8.7
food 8.7
bride 8.6
married 8.6
school 8.6
business 8.5
enjoyment 8.4
wedding 8.3
teenage 7.7
mother 7.6
talking 7.6
life 7.6
enjoying 7.6
togetherness 7.5
meeting 7.5
friends 7.5
friendship 7.5
groom 7.5
style 7.4
wine 7.4
children 7.3
child 7.1
businessman 7.1

Google
created on 2019-06-07

Microsoft
created on 2019-06-07

table 98.4
furniture 92.4
indoor 91.8
clothing 91.5
person 86.5
white 83.2
chair 81.3
woman 66.5
man 66
old 62
library 61.7
room 41.3
dining table 9.6

Color Analysis

Face analysis

AWS Rekognition

Age 4-9
Gender Female, 54.1%
Angry 46.7%
Sad 51.1%
Surprised 45.3%
Confused 45.4%
Happy 45.1%
Calm 46.2%
Disgusted 45.3%

AWS Rekognition

Age 4-9
Gender Female, 51.1%
Angry 53.7%
Sad 46%
Surprised 45%
Confused 45.1%
Happy 45%
Calm 45%
Disgusted 45.1%

AWS Rekognition

Age 4-9
Gender Female, 52.4%
Sad 46.4%
Calm 45.2%
Angry 52.2%
Happy 45.1%
Disgusted 45.1%
Confused 45.6%
Surprised 45.3%

AWS Rekognition

Age 4-7
Gender Female, 53.5%
Confused 45.5%
Angry 45.2%
Surprised 45.2%
Calm 45.7%
Sad 53.3%
Happy 45.1%
Disgusted 45.1%

AWS Rekognition

Age 6-13
Gender Female, 51.7%
Confused 45.5%
Angry 46.4%
Surprised 45.3%
Calm 46.1%
Sad 47.2%
Happy 45.3%
Disgusted 49.1%

AWS Rekognition

Age 4-7
Gender Female, 50.6%
Sad 48.3%
Surprised 46.5%
Disgusted 45.3%
Calm 47.8%
Angry 45.7%
Happy 45.6%
Confused 45.7%

AWS Rekognition

Age 23-38
Gender Female, 50.3%
Calm 45.8%
Happy 46.9%
Surprised 46.5%
Disgusted 47.2%
Sad 46%
Angry 46.8%
Confused 45.8%

AWS Rekognition

Age 10-15
Gender Female, 52.7%
Angry 45.5%
Surprised 45.6%
Sad 49.9%
Disgusted 45.3%
Confused 45.7%
Happy 45.4%
Calm 47.6%

AWS Rekognition

Age 12-22
Gender Female, 50.3%
Sad 49.8%
Angry 49.5%
Happy 49.9%
Disgusted 49.5%
Calm 49.6%
Surprised 49.6%
Confused 49.5%

AWS Rekognition

Age 35-52
Gender Female, 53.3%
Angry 45.5%
Surprised 45.1%
Sad 45.9%
Confused 45.2%
Happy 45.1%
Calm 47.5%
Disgusted 50.7%

AWS Rekognition

Age 6-13
Gender Female, 51.1%
Surprised 45.6%
Happy 45.1%
Calm 52%
Angry 45.7%
Disgusted 45.3%
Sad 45.5%
Confused 45.9%

AWS Rekognition

Age 2-5
Gender Female, 50.4%
Surprised 49.5%
Happy 49.8%
Sad 49.8%
Confused 49.5%
Disgusted 49.5%
Angry 49.5%
Calm 49.8%

AWS Rekognition

Age 20-38
Gender Female, 50.3%
Sad 49.6%
Surprised 49.8%
Calm 49.6%
Disgusted 49.6%
Confused 49.6%
Angry 49.6%
Happy 49.8%

AWS Rekognition

Age 9-14
Gender Female, 50.5%
Happy 49.5%
Angry 49.5%
Surprised 49.5%
Disgusted 49.5%
Confused 49.5%
Calm 49.5%
Sad 50.4%

AWS Rekognition

Age 17-27
Gender Female, 53.9%
Happy 45.4%
Surprised 45.2%
Sad 49.2%
Confused 45.2%
Disgusted 46.4%
Calm 45%
Angry 48.6%

AWS Rekognition

Age 16-27
Gender Female, 50.2%
Surprised 49.6%
Disgusted 49.6%
Sad 49.7%
Angry 49.6%
Calm 49.9%
Happy 49.6%
Confused 49.5%

AWS Rekognition

Age 26-43
Gender Male, 50.4%
Angry 49.8%
Sad 50%
Happy 49.5%
Surprised 49.5%
Disgusted 49.5%
Confused 49.6%
Calm 49.6%

Microsoft Cognitive Services

Age 37
Gender Female

Microsoft Cognitive Services

Age 5
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Chair 98.5%
Person 97.9%

Categories

Text analysis

Amazon

CODPISF

Google

G CODFISH
G
CODFISH