Human Generated Data

Title

Head of a Model

Date

c. 1878

People

Artist: John Singer Sargent, American, 1856–1925

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Miss Emily Sargent and Mrs. Francis Ormond in memory of their brother, John Singer Sargent, 1931.78

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Art 100
Face 99.8
Head 99.8
Photography 99.8
Portrait 99.8
Drawing 99.8
Person 97.2
Adult 97.2
Bride 97.2
Female 97.2
Wedding 97.2
Woman 97.2
Painting 97.2

Clarifai
created on 2018-05-10

paper 95.8
people 94.3
blank 93.7
desktop 93.7
art 93.2
illustration 92.9
man 92.5
portrait 90.5
hand 87.7
bill 87.6
adult 87.6
person 84.9
design 84.3
symbol 83.9
one 82.7
retro 82.4
empty 82.4
head 81.4
message 79.6
sign 79.6

Imagga
created on 2023-10-05

representation 60.3
sketch 51.8
mug shot 48.5
photograph 39.5
paper 36.9
drawing 36.9
blank 25.9
envelope 23.6
money 23
business 21.9
creation 19.7
cash 19.2
currency 18.9
sign 18.8
dollar 16.7
message 16.4
empty 16.3
container 16.2
bank 15.9
note 15.6
design 15.2
finance 15.2
card 14.8
object 14.7
symbol 14.1
bill 13.3
banking 12.9
wealth 12.6
post 12.4
office 12
hundred 11.6
us 11.6
financial 10.7
write 10.4
close 10.3
savings 10.3
one 9.7
dollars 9.7
pay 9.6
jersey 9.6
page 9.3
document 9.3
letter 9.2
frame 9.2
old 9.1
pattern 8.9
banknotes 8.8
closeup 8.8
label 8.4
vintage 8.3
investment 8.3
banknote 7.8
gift 7.7
price 7.7
loan 7.7
texture 7.6
exchange 7.6
retail 7.6
sheet 7.5
rich 7.4
element 7.4
economy 7.4
style 7.4
black 7.3
shape 7.3
shirt 7.2
board 7.2
collection 7.2
shopping 7.2
box 7.2
shadow 7.2
market 7.1

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

Color Analysis

Face analysis

AWS Rekognition

Age 19-27
Gender Male, 100%
Calm 94.7%
Surprised 6.4%
Fear 5.9%
Sad 2.2%
Confused 2.2%
Disgusted 2.1%
Angry 0.3%
Happy 0.1%

Microsoft Cognitive Services

Age 36
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.2%
Adult 97.2%
Bride 97.2%
Female 97.2%
Woman 97.2%

Captions

Microsoft
created on 2018-05-10

a close up of a mans face 86.1%
close up of a mans face 82.5%
a close up of a persons face 82.4%