Human Generated Data

Title

Charity, Children: United States. New York. Pleasantville. Hebrew Sheltering Guardian Society: Hebrew Sheltering Guardian Society Orphan Asylum, Pleasantville, New York: Boys are taught woodwork, machine work, electric shop work and other trades.

Date

c. 1900

People

Artist: Wolfson, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Social Museum Collection, 3.2002.346.1

Machine Generated Data

Tags

Amazon
created on 2022-06-25

Person 99.6
Human 99.6
Person 99.3
Person 99.3
Person 99.2
Person 99.2
Person 98.6
Person 98
Person 96.6
Person 94.3
Building 87
Lab 84.2
Chair 80.8
Furniture 80.8
Room 76.8
Indoors 76.8
Person 73.1
Factory 68.6
Cafeteria 68.1
Restaurant 68.1
Worker 63.6
Workshop 62.5
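
The label/confidence pairs above are the kind of output returned by Amazon Rekognition's DetectLabels operation. A minimal sketch of how such tags could be produced with boto3 follows; the file name, region, and confidence threshold are illustrative assumptions, not part of this record.

```python
# Sketch: label detection with Amazon Rekognition via boto3.
# Assumes AWS credentials are configured and "photo.jpg" is a local copy of the image.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # region is an assumption

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=60,  # illustrative cutoff; the list above bottoms out near 62%
    )

# Print each label with its confidence, mirroring the "Person 99.6" style above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```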

Imagga
created on 2022-06-25

musical instrument 33.9
percussion instrument 33.6
interior 32.7
room 31
marimba 23.6
home 23.1
table 21.9
modern 21.7
indoors 20.2
chair 20.2
furniture 18.5
house 18.4
kitchen 17.4
people 17.3
decor 15.9
counter 15.9
office 15.6
work 14.9
man 14.8
glass 14
business 14
person 13.9
luxury 13.7
male 13.5
light 13.4
architecture 13.3
smiling 13
device 12.8
indoor 12.8
lifestyle 12.3
inside 12
women 11.9
comfortable 11.4
design 11.2
apartment 10.5
sitting 10.3
restaurant 10.1
happy 10
wood 10
life 9.7
job 9.7
stringed instrument 9.7
desk 9.7
hospital 9.7
cabinet 8.9
businessman 8.8
decoration 8.7
happiness 8.6
lamp 8.6
elegant 8.6
vibraphone 8.5
stage 8.4
steel 8.3
holding 8.2
style 8.2
working 7.9
equipment 7.9
together 7.9
food 7.9
worker 7.9
shop 7.7
men 7.7
two 7.6
elegance 7.6
window 7.5
floor 7.4
executive 7.4
adult 7.3
group 7.2
color 7.2
professional 7.2
hall 7
wooden 7
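
Imagga exposes a REST auto-tagging endpoint that returns a comparable tag/confidence list. The sketch below shows how such a request might look; the API key, secret, and image URL are placeholders, and the response layout is assumed from Imagga's documented v2 format and should be verified against current documentation.

```python
# Sketch: auto-tagging with the Imagga v2 /tags endpoint (credentials and URL are placeholders).
import requests

API_KEY = "YOUR_API_KEY"        # placeholder
API_SECRET = "YOUR_API_SECRET"  # placeholder
IMAGE_URL = "https://example.org/photo.jpg"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
response.raise_for_status()

# Each entry carries a confidence score and a tag name keyed by language code.
for item in response.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```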

Google
created on 2022-06-25

Microsoft
created on 2022-06-25

wall 99.6
indoor 99.2
person 95.2
clothing 83.9
furniture 76
man 60.9
text 58.3
table 36.7
several 13.5
worktable 11.9
desk 8.6
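
The Microsoft tags resemble output from the Azure Computer Vision tagging operation. A hedged sketch using the REST API follows; the endpoint, key, and API version are assumptions, and Azure reports confidence on a 0-1 scale, so it is multiplied by 100 to match the percentages above.

```python
# Sketch: image tagging with the Azure Computer Vision REST API (v3.2 "tag" operation assumed;
# endpoint and subscription key are placeholders).
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder

with open("photo.jpg", "rb") as f:
    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/tag",
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
        timeout=30,
    )
response.raise_for_status()

for tag in response.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```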

Color Analysis

Face analysis

AWS Rekognition

Age 18-26
Gender Male, 100%
Calm 44.7%
Confused 35.8%
Sad 14.5%
Surprised 6.8%
Fear 6.2%
Angry 1.4%
Disgusted 1.4%
Happy 0.6%

AWS Rekognition

Age 22-30
Gender Male, 89.7%
Sad 100%
Calm 7.7%
Surprised 6.5%
Fear 6.1%
Angry 3.8%
Confused 1.5%
Happy 0.7%
Disgusted 0.4%

AWS Rekognition

Age 18-24
Gender Male, 99.2%
Calm 59.4%
Sad 25%
Confused 8.1%
Surprised 7.5%
Angry 7.3%
Fear 6.4%
Disgusted 2.1%
Happy 0.5%

AWS Rekognition

Age 19-27
Gender Male, 100%
Calm 98.5%
Surprised 6.3%
Fear 5.9%
Sad 2.3%
Confused 0.3%
Angry 0.2%
Happy 0.2%
Disgusted 0.1%

AWS Rekognition

Age 23-31
Gender Female, 99.7%
Calm 98.2%
Surprised 6.4%
Fear 5.9%
Sad 2.3%
Happy 0.5%
Angry 0.2%
Disgusted 0.1%
Confused 0%

AWS Rekognition

Age 16-22
Gender Male, 60.2%
Calm 96.6%
Surprised 6.3%
Fear 5.9%
Sad 2.7%
Happy 0.9%
Confused 0.2%
Angry 0.2%
Disgusted 0.1%
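
Each block above corresponds to one face returned by Amazon Rekognition's DetectFaces operation with full attributes requested, which is what yields the age range, gender estimate, and per-emotion confidences. A minimal sketch, with the image path as an assumption:

```python
# Sketch: face analysis with Amazon Rekognition DetectFaces, requesting all attributes.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions are reported as independent confidences; sort from highest to lowest
    # to match the ordering of the blocks above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```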

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely
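
The Google Vision entries report categorical likelihoods (Very unlikely through Very likely) rather than percentages. A minimal sketch using the google-cloud-vision client; the file path is an assumption and application credentials must already be configured.

```python
# Sketch: face detection with the Google Cloud Vision API, printing the likelihood
# fields shown above for each detected face (enum names such as VERY_UNLIKELY).
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```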

Feature analysis

Amazon

Person 99.6%
Chair 80.8%

Categories

Imagga

interior objects 97.6%
food drinks 2.1%

Text analysis

Amazon

48

Google

ЧХ
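
The detected strings above ("48" from Amazon, "ЧХ" from Google) are typical OCR hits on signage or markings in the photograph. A minimal sketch of both text-detection calls, with the image path as an assumption:

```python
# Sketch: text detection with Amazon Rekognition and Google Cloud Vision on the same image.
import boto3
from google.cloud import vision

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

# Amazon Rekognition: line-level detections.
rekognition = boto3.client("rekognition")
for detection in rekognition.detect_text(Image={"Bytes": image_bytes})["TextDetections"]:
    if detection["Type"] == "LINE":
        print("Amazon:", detection["DetectedText"])

# Google Cloud Vision: the first annotation holds the full detected text.
gcv = vision.ImageAnnotatorClient()
response = gcv.text_detection(image=vision.Image(content=image_bytes))
if response.text_annotations:
    print("Google:", response.text_annotations[0].description)
```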