Human Generated Data

Title

The Finding of Moses

Date

18th century

People

Artist: Benoit Louis Henriquez, French, 1732 - 1806

Artist after: Paolo Caliari, called Veronese, Italian, 1528 - 1588

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of William Gray from the collection of Francis Calley Gray, G1794

Machine Generated Data

Tags

Amazon
created on 2019-06-17

Person 99.2
Human 99.2
Painting 98.2
Art 98.2
Person 98.1
Person 96.1
Person 93
Person 90.2
Person 88.6
Face 58
Portrait 58
Photography 58
Photo 58
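
The Amazon tags above have the shape of AWS Rekognition label-detection output. Below is a minimal sketch of such a call, assuming boto3 is configured with valid credentials; the file name is a placeholder, not the museum's actual image asset.

```python
import boto3

# Sketch of an AWS Rekognition label-detection request.
# "finding_of_moses.jpg" is a placeholder for the digitized print.
client = boto3.client("rekognition", region_name="us-east-1")

with open("finding_of_moses.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,
        MinConfidence=50,
    )

# Prints name/confidence pairs in the same form as the list above,
# e.g. "Person 99.2".
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))
```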

Clarifai
created on 2019-06-17

people 100
group 99.7
adult 98.7
woman 98.5
child 98.1
art 97.7
man 97.1
print 96.9
family 96.1
portrait 95.5
two 93
offspring 92.5
three 92.2
baby 91.3
affection 91.1
wear 90.7
four 90.2
son 88.8
music 88.8
recreation 85.7
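
The Clarifai concepts resemble predictions from Clarifai's general model. A hedged sketch against the v2 REST endpoint follows; the API key, image URL, and model identifier are placeholders and may not match what was actually used in 2019.

```python
import requests

# Placeholder credentials, model ID, and image URL.
API_KEY = "YOUR_CLARIFAI_API_KEY"
MODEL_ID = "general-image-recognition"
IMAGE_URL = "https://example.org/finding_of_moses.jpg"

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

# Concepts carry a 0-1 "value"; scale to match the percentages above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(concept["name"], round(concept["value"] * 100, 1))
```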

Imagga
created on 2019-06-17

kin 54.7
portrait 23.9
child 20
man 19.5
people 16.7
love 15.8
couple 15.7
face 15.6
happy 15
male 14.2
mother 13.7
parent 13.6
family 13.3
statue 13.1
adult 13
sculpture 12.8
old 12.5
attractive 11.9
newspaper 11.7
world 11.6
home 11.2
culture 11.1
hair 11.1
smiling 10.8
vintage 10.8
antique 10.5
product 10.3
close 10.3
money 10.2
happiness 10.2
sibling 10.1
art 9.4
relationship 9.4
cute 9.3
cash 9.1
creation 9.1
black 9
sexy 8.8
together 8.8
person 8.7
women 8.7
boy 8.7
head 8.4
pretty 8.4
one 8.2
sketch 8.2
decoration 8.2
aged 8.1
lady 8.1
currency 8.1
lifestyle 7.9
room 7.9
ancient 7.8
sitting 7.7
youth 7.7
elderly 7.7
two 7.6
fashion 7.5
banking 7.4
girls 7.3
looking 7.2
religion 7.2
bank 7.2
romantic 7.1
kid 7.1
sofa 7
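
The Imagga tags correspond to the output of Imagga's tagging endpoint. A minimal sketch against the v2 REST API, with placeholder credentials and image URL:

```python
import requests

# Placeholder credentials and image URL.
API_KEY = "YOUR_IMAGGA_API_KEY"
API_SECRET = "YOUR_IMAGGA_API_SECRET"
IMAGE_URL = "https://example.org/finding_of_moses.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Each entry pairs an English tag with a 0-100 confidence score.
for tag in response.json()["result"]["tags"]:
    print(tag["tag"]["en"], round(tag["confidence"], 1))
```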

Google
created on 2019-06-17

Microsoft
created on 2019-06-17

text 99.8
painting 92.3
person 87
old 85
clothing 84
window 83.2
drawing 82.2
man 50.9
posing 50
vintage 46.5
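
The Microsoft tags have the shape of Azure Computer Vision's image-tagging output. A sketch against the REST API follows; the endpoint host, key, image URL, and API version are placeholders and may differ from what generated the 2019 tags.

```python
import requests

# Placeholder endpoint, key, and image URL.
ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_AZURE_KEY"
IMAGE_URL = "https://example.org/finding_of_moses.jpg"

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
response.raise_for_status()

# Confidence is reported on a 0-1 scale; scale to percentages as above.
for tag in response.json()["tags"]:
    print(tag["name"], round(tag["confidence"] * 100, 1))
```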

Color Analysis

Face analysis

AWS Rekognition

Age 11-18
Gender Female, 54.8%
Happy 45%
Surprised 45%
Angry 45%
Disgusted 45%
Calm 45.1%
Sad 54.7%
Confused 45%

AWS Rekognition

Age 20-38
Gender Female, 54.2%
Confused 45.2%
Calm 46.1%
Angry 45.3%
Surprised 45.1%
Happy 45.3%
Sad 52.8%
Disgusted 45.2%

AWS Rekognition

Age 48-68
Gender Male, 50.7%
Happy 47%
Angry 45.2%
Calm 49.2%
Sad 48.1%
Disgusted 45.1%
Surprised 45.1%
Confused 45.2%

AWS Rekognition

Age 26-43
Gender Female, 52.2%
Surprised 45.4%
Sad 48.8%
Disgusted 45.2%
Confused 45.2%
Happy 45.1%
Angry 46.3%
Calm 48.9%

AWS Rekognition

Age 10-15
Gender Female, 53.8%
Happy 45%
Sad 54.4%
Angry 45.1%
Confused 45.1%
Surprised 45.1%
Disgusted 45%
Calm 45.3%

AWS Rekognition

Age 12-22
Gender Female, 51.1%
Angry 45.8%
Happy 45.2%
Calm 51.1%
Sad 47%
Surprised 45.3%
Disgusted 45.1%
Confused 45.4%

AWS Rekognition

Age 35-52
Gender Female, 54.1%
Sad 45.2%
Calm 51.4%
Happy 47.7%
Disgusted 45.1%
Surprised 45.3%
Confused 45.2%
Angry 45.1%

AWS Rekognition

Age 14-23
Gender Female, 54.7%
Confused 45.1%
Surprised 45.4%
Disgusted 45%
Happy 45%
Sad 51.5%
Angry 45.3%
Calm 47.7%
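
The per-face age ranges, gender estimates, and emotion scores above match the structure of AWS Rekognition face detection. A minimal sketch, again assuming configured boto3 credentials and a placeholder file name:

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

# Placeholder file name for the digitized print.
with open("finding_of_moses.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {round(gender['Confidence'], 1)}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {round(emotion['Confidence'], 1)}%")
```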

Microsoft Cognitive Services

Age 25
Gender Female

Microsoft Cognitive Services

Age 25
Gender Female

Microsoft Cognitive Services

Age 27
Gender Female

Microsoft Cognitive Services

Age 26
Gender Female

Microsoft Cognitive Services

Age 27
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Unlikely
Headwear Very likely
Blurred Very unlikely
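
The Google Vision entries report likelihood buckets rather than percentages, matching the face-detection feature of the Cloud Vision API. A minimal sketch, assuming application default credentials and the google-cloud-vision client library; the image URI is a placeholder.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Placeholder URI for the digitized print.
image = vision.Image()
image.source.image_uri = "https://example.org/finding_of_moses.jpg"

response = client.face_detection(image=image)

# Each attribute is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY).
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```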

Feature analysis

Amazon

Person 99.2%
Painting 98.2%

Categories

Imagga

paintings art 79.2%
pets animals 20.7%
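
The closing Imagga categories ("paintings art", "pets animals") look like the output of Imagga's categorization endpoint rather than its tagger. A hedged sketch; the categorizer name used here (personal_photos) is an assumption, as are the credentials and image URL.

```python
import requests

# Placeholder credentials and image URL; "personal_photos" is an assumed
# categorizer name and may not be the one used for this record.
API_KEY = "YOUR_IMAGGA_API_KEY"
API_SECRET = "YOUR_IMAGGA_API_SECRET"
IMAGE_URL = "https://example.org/finding_of_moses.jpg"

response = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

for category in response.json()["result"]["categories"]:
    print(category["name"]["en"], round(category["confidence"], 1))
```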