Human Generated Data

Title

Scene at a Well (probably from the Legend of Hagar)

Date

17th century

People

Artist: Unidentified Artist

Previous attribution: Sébastien Bourdon, French, 1616 - 1671

Classification

Paintings

Credit Line

Harvard Art Museums/Fogg Museum, Francis H. Burr Memorial Fund, 1938.114

Machine Generated Data

Tags

Amazon
created on 2020-04-24

Person 99.8
Human 99.8
Person 99.6
Person 99.3
Person 99.1
Person 98.7
Person 98.2
Person 98
Art 97.2
Painting 97.2
Person 94.1
Person 76.6
Photography 61.4
Portrait 61.4
Photo 61.4
Face 61.4
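
The Amazon tags above are label/confidence pairs produced by an image-tagging service. As a minimal sketch of how such output might be post-processed, the following Python snippet (with the values transcribed from the list above, and an arbitrary illustrative threshold) keeps only the unique labels whose confidence clears a cutoff:

```python
# Label/confidence pairs transcribed from the Amazon tags listed above.
amazon_tags = [
    ("Person", 99.8), ("Human", 99.8), ("Person", 99.6),
    ("Person", 99.3), ("Person", 99.1), ("Person", 98.7),
    ("Person", 98.2), ("Person", 98.0), ("Art", 97.2),
    ("Painting", 97.2), ("Person", 94.1), ("Person", 76.6),
    ("Photography", 61.4), ("Portrait", 61.4), ("Photo", 61.4),
    ("Face", 61.4),
]

def confident_labels(tags, threshold=90.0):
    """Keep unique label names whose confidence meets the threshold,
    preserving first-seen order."""
    seen = []
    for label, score in tags:
        if score >= threshold and label not in seen:
            seen.append(label)
    return seen

print(confident_labels(amazon_tags))
# → ['Person', 'Human', 'Art', 'Painting']
```

The 90.0 cutoff is not part of the record; lower it to admit the weaker photography-related labels.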

Clarifai
created on 2020-04-24

people 100
group 99.7
adult 99.2
child 98.5
man 97.3
many 97.2
group together 97.2
woman 95.9
print 93.6
military 92
furniture 91.9
several 91.8
sit 91.1
war 90.7
boy 90.6
home 88.6
wear 87.9
art 87.2
administration 86.9
soldier 86.8

Imagga
created on 2020-04-24

kin 48.5
child 27.6
people 21.8
person 18.9
world 18.3
man 18.1
adult 16.9
sunset 15.3
silhouette 14.1
portrait 12.9
old 12.5
happiness 12.5
happy 12.5
mother 12.5
parent 12
male 11.6
outdoor 11.5
love 11
water 10.7
couple 10.5
sport 10.3
dark 10
ancient 9.5
stone 9.5
walking 9.5
architecture 9.4
outdoors 9.1
black 9
family 8.9
light 8.8
lifestyle 8.7
culture 8.5
face 8.5
serene 8.5
park 8.2
fun 8.2
active 8.1
history 8
art 8
life 8
boy 7.8
model 7.8
men 7.7
youth 7.7
sky 7.7
statue 7.6
beach 7.6
sibling 7.6
field 7.5
father 7.5
leisure 7.5
holding 7.4
tourism 7.4
vacation 7.4
building 7.3
peaceful 7.3
dirty 7.2
religion 7.2

Google
created on 2020-04-24

Photograph 95.3
Painting 93.4
People 93.1
Art 81.3
Stock photography 76.5
Visual arts 68.6
Photography 67.8
Artwork 57.4
Child 57.3
Black-and-white 56.4
Illustration 54.5
History 54.1
Family 50.2

Microsoft
created on 2020-04-24

painting 97.6
outdoor 96.9
text 93.4
drawing 91.2
person 90.7
clothing 88.2
man 70.9
old 68.4
group 61
sketch 58.5
posing 42.2

Face analysis

Amazon

Google

AWS Rekognition

Age 31-47
Gender Male, 52.9%
Surprised 47.3%
Confused 45.2%
Happy 45.2%
Angry 48.5%
Calm 46.5%
Fear 46.8%
Disgusted 45.1%
Sad 45.3%

AWS Rekognition

Age 23-35
Gender Female, 54.4%
Confused 45.1%
Happy 45.1%
Disgusted 45%
Calm 54.3%
Fear 45%
Surprised 45.2%
Sad 45.3%
Angry 45%

AWS Rekognition

Age 6-16
Gender Female, 54.1%
Sad 46.2%
Disgusted 45%
Angry 45.1%
Fear 45.2%
Surprised 45.1%
Happy 45.3%
Calm 53%
Confused 45.1%

AWS Rekognition

Age 12-22
Gender Male, 54.8%
Happy 45%
Fear 45%
Calm 52.6%
Angry 45.1%
Confused 45%
Surprised 45%
Disgusted 45%
Sad 47.3%

AWS Rekognition

Age 36-54
Gender Male, 54.7%
Disgusted 46.3%
Sad 47.5%
Angry 46.5%
Confused 45.4%
Calm 47.5%
Happy 46.3%
Surprised 45.4%
Fear 45.2%
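
Each AWS Rekognition face record above pairs eight emotions with a confidence score. A minimal sketch (values transcribed from the first face record above) of reading off the dominant emotion for a face:

```python
# Emotion scores for one detected face, transcribed from the first
# AWS Rekognition face record above.
face_emotions = {
    "Surprised": 47.3, "Confused": 45.2, "Happy": 45.2,
    "Angry": 48.5, "Calm": 46.5, "Fear": 46.8,
    "Disgusted": 45.1, "Sad": 45.3,
}

def dominant_emotion(scores):
    """Return the emotion label with the highest confidence score."""
    return max(scores, key=scores.get)

print(dominant_emotion(face_emotions))
# → Angry  (48.5 is the highest score in this record)
```

Note that the scores here are tightly clustered (45-49%), so the "dominant" emotion is a weak signal for this face; the later records, with Calm above 52%, are more decisive.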

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Painting 97.2%

Captions

Microsoft

a group of people posing for a photo 89.4%
an old photo of a group of people posing for the camera 88.2%
a group of people in an old photo of a person 86.6%