Human Generated Data

Title

Theatrical Scene (?)

Date

1701-1799

People

Artist: Unidentified Artist

Previous attribution: Jacques-François Courtin, French, 1672-1752

Classification

Paintings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Grenville L. Winthrop, Class of 1886, 1939.110

Machine Generated Data

Tags

Amazon
created on 2020-04-24

Human 99.4
Person 99.4
Person 98.7
Painting 98.6
Art 98.6
Portrait 66.6
Photography 66.6
Photo 66.6
Face 66.6

Clarifai
created on 2020-04-24

people 100
group 99.7
leader 99.2
adult 98.8
print 98.1
two 98
three 97.5
man 97
monarch 96.5
royalty 96.2
portrait 96.1
woman 95.4
art 94
offspring 93.6
five 93.3
engraving 93.3
four 92.6
painting 92.1
writer 91.6
furniture 90.6

Imagga
created on 2020-04-24

newspaper 26.8
window 24.5
product 21.4
vintage 19
windowsill 18.7
creation 17.8
one 17.2
sill 17
money 17
framework 16.9
black 16.8
art 15.9
cash 15.5
old 14.6
currency 14.3
window screen 14
dollar 13.9
postmark 13.8
postage 13.7
postal 13.7
culture 13.7
envelope 13.6
stamp 13.5
mail 13.4
business 13.3
screen 13.3
structural member 12.9
letter 12.8
man 12.8
financial 12.5
portrait 12.3
printed 11.8
paintings 11.7
museum 11.6
us 11.5
symbol 11.4
office 11.1
banking 11
finance 11
kin 10.9
masterpiece 10.9
known 10.8
shows 10.8
renaissance 10.8
support 10.7
face 10.6
retro 10.6
people 10.6
supporting structure 10.6
painted 10.5
post 10.5
fine 10.5
person 10.5
television 10.4
unique 10.4
ancient 10.4
savings 10.2
paper 10.2
post mail 9.9
zigzag 9.9
fame 9.9
bank 9.8
painter 9.8
delivery 9.7
close 9.7
cutting 9.6
dollars 9.6
communications 9.6
closeup 9.4
mother 9.4
protective covering 9.3
male 9.2
global 9.1
aged 9
wealth 9
religion 9
circa 8.9
icon 8.7
wall 8.7
loan 8.6
capital 8.5
world 8.3
antique 8.2
grandma 8.2
frame 7.9
banknotes 7.8
pay 7.7
device 7.5
child 7.5
house 7.5
rich 7.4
telecommunication system 7.4
church 7.4
structure 7.2
history 7.1
family 7.1
women 7.1
temple 7.1

Google
created on 2020-04-24

Microsoft
created on 2020-04-24

human face 98.7
person 98.1
clothing 97.5
text 95.6
smile 92.2
woman 88.7
old 76.3
posing 70.1
museum 69.1
picture frame 36

Color Analysis

Face analysis

AWS Rekognition

Age 5-15
Gender Female, 63.3%
Happy 2.1%
Angry 0.8%
Calm 92.8%
Disgusted 0.2%
Surprised 1.2%
Confused 0.5%
Fear 0.4%
Sad 2%

AWS Rekognition

Age 9-19
Gender Female, 97.4%
Confused 5.1%
Calm 39.5%
Disgusted 0.6%
Surprised 9.2%
Sad 0.8%
Happy 42.5%
Angry 1.5%
Fear 0.7%

Microsoft Cognitive Services

Age 28
Gender Female

Microsoft Cognitive Services

Age 13
Gender Female

Google Vision

Surprise Unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Painting 98.6%

Categories

Captions

Microsoft
created on 2020-04-24

a vintage photo of a person 86.8%
an old photo of a person 86.6%
a vintage photo of a girl 71.1%

Text analysis

Google

υυσο
υυσο