Human Generated Data

Title

Jobs Returning from the University

Date

19th century

People

Artist: Tamme Weyerd Theodor Janssen, German, 1816–1894

Artist after: Johann Peter Hasenclever

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Belinda L. Randall from the collection of John Witt Randall, R10348

Machine Generated Data

Tags

Amazon
created on 2019-08-09

Person 98.9
Human 98.9
Person 97.3
Art 96.2
Person 93.8
Painting 90.2
Person 85.5
Person 76.5
Person 68.6
Person 56.1
Person 44.6
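The machine-generated tags above are flat "label confidence" pairs, where a label may itself contain spaces (e.g. "shaping tool 9.9" in the Imagga list). A minimal Python sketch for turning such lines into structured records — a hypothetical helper for working with this page's data, not part of any provider's API:

```python
# Parse flat "label confidence" lines into structured records.
# The confidence is always the trailing token, so split from the right;
# everything before it is the (possibly multi-word) label.
def parse_tags(lines):
    tags = []
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank separator lines
        label, _, score = line.rpartition(" ")
        tags.append({"label": label, "confidence": float(score)})
    return tags

# Example using a few of the Amazon tags listed above.
amazon_tags = parse_tags([
    "Person 98.9",
    "Art 96.2",
    "Painting 90.2",
])
print(amazon_tags[0])  # {'label': 'Person', 'confidence': 98.9}
```

Splitting from the right with `rpartition` is what makes multi-word labels such as "shaping tool" or "book jacket" parse correctly.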

Clarifai
created on 2019-08-09

people 99.9
adult 99.5
group 99.3
illustration 98.6
art 98.2
wear 98
woman 97.8
furniture 97.7
two 97.6
man 97.2
painting 95.2
print 94.5
room 93.1
seat 92.8
one 92.1
portrait 91.9
outfit 90.5
veil 88.7
three 88.6
child 86.4

Imagga
created on 2019-08-09

sketch 100
drawing 100
representation 76.8
stamp 38.5
vintage 30.6
envelope 26.7
old 26.5
paper 25.9
retro 25.4
grunge 23.8
mail 20.1
art 18.9
letter 18.4
antique 18.2
ancient 18.2
postage 17.7
postmark 16.8
currency 16.2
design 15.8
money 15.3
symbol 14.8
post 14.3
postal 13.7
aged 13.6
history 13.4
die 13.2
cash 12.8
finance 12.7
texture 12.5
decoration 11.6
card 11.1
note 11
philately 10.9
frame 10.8
container 10.8
bank 10.8
bill 10.5
black 10.2
banking 10.1
global 10
decorative 10
shaping tool 9.9
collection 9.9
sign 9.8
graphic 9.5
wealth 9
financial 8.9
pattern 8.9
printed 8.9
detail 8.9
object 8.8
book jacket 8.7
old fashioned 8.6
dollar 8.4
painting 8.1
brown 8.1
business 7.9
artistic 7.8
bills 7.8
jacket 7.8
dollars 7.7
exchange 7.6
worn 7.6
church 7.4
investment 7.3
artwork 7.3
message 7.3
religion 7.2
wrapping 7.1

Google
created on 2019-08-09

Microsoft
created on 2019-08-09

gallery 99.9
room 99.9
scene 99.9
drawing 99.2
sketch 98.2
person 89.4
art 88.5
painting 88.1
clothing 85.4
text 79.8
white 64.5
cartoon 58.4
woman 56.8
old 55.6
vintage 26

Face analysis

Amazon

AWS Rekognition

Age 22-34
Gender Male, 51.6%
Confused 45.2%
Calm 48.2%
Fear 50.1%
Sad 45.3%
Surprised 45.6%
Happy 45.3%
Angry 45.2%
Disgusted 45.1%

AWS Rekognition

Age 5-15
Gender Female, 50.5%
Disgusted 49.5%
Angry 49.5%
Fear 50.2%
Calm 49.6%
Happy 49.5%
Confused 49.5%
Surprised 49.5%
Sad 49.6%

AWS Rekognition

Age 15-27
Gender Female, 50%
Disgusted 49.5%
Angry 49.5%
Confused 49.5%
Happy 49.8%
Fear 49.5%
Calm 50.2%
Sad 49.5%
Surprised 49.5%

AWS Rekognition

Age 17-29
Gender Female, 51.5%
Disgusted 45%
Fear 45%
Happy 45.9%
Surprised 45.1%
Angry 45.2%
Confused 45%
Sad 45.3%
Calm 53.5%

AWS Rekognition

Age 6-16
Gender Female, 50.4%
Angry 49.6%
Confused 49.5%
Happy 49.5%
Disgusted 49.5%
Calm 49.6%
Fear 49.7%
Sad 49.5%
Surprised 50.1%

AWS Rekognition

Age 23-35
Gender Female, 50.5%
Angry 49.5%
Surprised 49.6%
Calm 49.5%
Disgusted 49.5%
Sad 49.5%
Fear 50.3%
Happy 49.5%
Confused 49.5%

AWS Rekognition

Age 24-38
Gender Male, 50.1%
Happy 49.5%
Fear 49.5%
Sad 49.6%
Disgusted 49.6%
Angry 50.1%
Calm 49.8%
Surprised 49.5%
Confused 49.5%

AWS Rekognition

Age 7-17
Gender Female, 50.3%
Happy 49.5%
Disgusted 49.5%
Confused 49.5%
Sad 49.5%
Fear 50.3%
Surprised 49.5%
Angry 49.7%
Calm 49.5%
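Each AWS Rekognition face record above assigns one confidence score per emotion; the emotion the service leans toward is simply the highest-scoring entry. A small sketch (hypothetical, assuming the percentages for one face have already been parsed into a dict) that picks the dominant emotion:

```python
# Emotion -> confidence map mirroring the first face record above.
face = {
    "Confused": 45.2, "Calm": 48.2, "Fear": 50.1, "Sad": 45.3,
    "Surprised": 45.6, "Happy": 45.3, "Angry": 45.2, "Disgusted": 45.1,
}

# The dominant emotion is the key with the maximum confidence.
dominant = max(face, key=face.get)
print(dominant)  # Fear
```

Note how close the scores are to one another in these records; a near-uniform spread like this means the "dominant" label carries little certainty.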

Feature analysis

Amazon

Person 98.9%
Painting 90.2%

Captions

Microsoft

a vintage photo of a person in a white room 70.3%
a vintage photo of a person 70.2%
a vintage photo of a person in a room 70.1%

Text analysis

Amazon

mt
ab bbe mt ilbind Jte
Jte
bbe
ilbind
ab