Human Generated Data

Title

Charity, Children: United States. New York. Pleasantville. Hebrew Sheltering Guardian Society: Hebrew Sheltering Guardian Society Orphan Asylum, Pleasantville, New York: Living rooms in the boys' and girls' cottages.

Date

c. 1900

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Social Museum Collection, 3.2002.293.2

Machine Generated Data

Tags

Amazon
created on 2022-06-04

Person 99.5
Human 99.5
Person 99.5
Person 99.4
Person 99
Person 98.5
Person 96.9
Person 96.4
Person 96.1
Person 94.5
Room 94.1
Indoors 94.1
Person 93.8
Clinic 92.2
Person 92
Person 91
Person 84.7
Person 82.3
Bedroom 82
Painting 69.9
Art 69.9
Person 60.9
Hospital 58.1
Dorm Room 57.4
Bed 55.9
Furniture 55.9
Kindergarten 55.8
People 55.1
Workshop 55.1

Imagga
created on 2022-06-04

barbershop 46.4
hospital 41
shop 39.5
mercantile establishment 31.3
home 30.3
people 30.1
man 28.2
adult 26.9
couple 26.1
room 25.8
person 24.2
hairdresser 23.4
male 22.8
happy 21.3
happiness 21.1
indoors 21.1
smiling 21
place of business 20.8
salon 17.8
love 17.4
family 16.9
interior 16.8
sitting 16.3
senior 15.9
together 15.8
smile 15.7
women 15
child 14.6
lifestyle 14.5
indoor 13.7
two 13.5
patient 13.3
portrait 12.9
dress 12.6
pretty 12.6
teacher 12.6
looking 12
wedding 12
nurse 11.8
house 11.7
care 11.5
bride 11.5
husband 11.4
loving 11.4
table 11.2
men 11.2
chair 11.1
day 11
lady 10.5
old 10.4
establishment 10.4
clinic 10.3
life 10.2
casual 10.2
window 10.1
attractive 9.8
cheerful 9.8
kid 9.7
health 9.7
new 9.7
two people 9.7
medical 9.7
mother 9.6
elderly 9.6
comfortable 9.5
bed 9.5
wife 9.5
togetherness 9.4
holiday 9.3
face 9.2
married 8.6
talking 8.6
modern 8.4
mature 8.4
color 8.3
inside 8.3
holding 8.3
occupation 8.2
alone 8.2
groom 8.2
father 8
boy 7.8
casual clothing 7.8
illness 7.6
relaxation 7.5
enjoyment 7.5
human 7.5
worker 7.3
children 7.3
work 7.2
celebration 7.2
educator 7.1
to 7.1
bedroom 7.1
little 7.1

Google
created on 2022-06-04

Microsoft
created on 2022-06-04

clothing 97.3
indoor 97
person 96.4
woman 87.5
dress 71.4
wedding dress 71.2
family 67
table 66.9
old 57.5
several 12.4

Face analysis

Amazon

Google

AWS Rekognition

Age 40-48
Gender Female, 99.9%
Sad 100%
Surprised 6.7%
Calm 6.5%
Fear 6%
Angry 4.9%
Confused 1.8%
Disgusted 0.9%
Happy 0.8%

AWS Rekognition

Age 36-44
Gender Female, 100%
Sad 99.8%
Calm 26.3%
Surprised 6.4%
Fear 6.1%
Angry 1.1%
Disgusted 0.5%
Confused 0.3%
Happy 0.2%

AWS Rekognition

Age 27-37
Gender Female, 100%
Calm 84.3%
Sad 13.5%
Surprised 6.4%
Fear 6%
Angry 0.5%
Disgusted 0.4%
Confused 0.3%
Happy 0.2%

AWS Rekognition

Age 20-28
Gender Male, 99.9%
Calm 73.1%
Sad 38.5%
Surprised 6.4%
Fear 6%
Angry 1.1%
Confused 0.7%
Happy 0.7%
Disgusted 0.3%

AWS Rekognition

Age 20-28
Gender Female, 56.6%
Sad 84%
Calm 61.4%
Surprised 6.3%
Fear 5.9%
Confused 0.1%
Angry 0.1%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 24-34
Gender Male, 96.1%
Calm 51.7%
Fear 15.1%
Surprised 8.3%
Disgusted 7.8%
Sad 6.4%
Confused 4.8%
Happy 4.8%
Angry 3.8%

AWS Rekognition

Age 18-24
Gender Female, 75.7%
Calm 76.8%
Surprised 6.7%
Confused 6.4%
Fear 6%
Happy 5.6%
Sad 5.2%
Disgusted 2.8%
Angry 0.6%

AWS Rekognition

Age 6-16
Gender Female, 91.6%
Calm 92.8%
Surprised 7.2%
Fear 6.2%
Sad 3%
Confused 0.9%
Happy 0.5%
Angry 0.3%
Disgusted 0.3%

AWS Rekognition

Age 16-24
Gender Male, 95.9%
Calm 41.3%
Happy 37%
Disgusted 10.2%
Fear 7.3%
Surprised 7.2%
Sad 3.1%
Angry 1.8%
Confused 1.7%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Feature analysis

Amazon

Person 99.5%
Painting 69.9%

Text analysis

Amazon

CIP

Google

CHED