Human Generated Data

Title

Do You Want Me?

Date

c. 1900

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Kate, Maurice R. and Melvin R. Seiden Purchase Fund for Photographs, P2002.17

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Furniture 99.8
Poster 77.2
Advertisement 77.2
Person 66.9
Human 66.9
Cradle 57.4
Person 43.9
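The lines above are flattened (label, confidence) pairs as returned by an image-labeling service. A minimal sketch of how such lines could be produced from a Rekognition-style `detect_labels` JSON response; the sample response below is illustrative, not the actual API output for this object:

```python
# Sample response in the shape Amazon Rekognition's DetectLabels returns
# (a "Labels" list of {"Name", "Confidence"} entries). Values here are
# copied from the tags listed above for illustration only.
sample_response = {
    "Labels": [
        {"Name": "Furniture", "Confidence": 99.8},
        {"Name": "Poster", "Confidence": 77.2},
        {"Name": "Person", "Confidence": 66.9},
    ]
}

def flatten_labels(response):
    """Return one 'Name confidence' line per detected label."""
    return [
        f"{label['Name']} {label['Confidence']:.1f}"
        for label in response["Labels"]
    ]

for line in flatten_labels(sample_response):
    print(line)
```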

Imagga
created on 2021-12-15

currency 43.1
money 41.7
cash 33.9
finance 33
container 32.2
paper 31.9
bank 31.4
dollar 30.6
wealth 27.8
banking 27.6
business 24.9
exchange 24.8
bill 23.8
financial 23.2
investment 22
old 20.9
savings 19.6
note 19.3
payment 19.2
banknote 18.4
pay 18.2
envelope 17.9
box 17.7
dollars 17.4
hundred 15.5
grunge 15.3
carton 14.9
bills 14.6
antique 13.8
loan 13.4
vintage 13.3
market 13.3
rich 13
banknotes 12.7
notes 12.5
retro 12.3
ancient 12.1
economy 12.1
sign 12
book 11.1
texture 11.1
notebook 10.9
us 10.6
profit 10.5
success 10.5
one 10.5
capital 10.4
history 9.8
twenty 9.8
wall 9.7
close 9.7
debt 9.6
card 9.6
coin 9.5
blank 9.4
symbol 9.4
commerce 9.3
design 9.1
salary 8.8
deposit 8.8
change 8.7
culture 8.5
buy 8.5
number 8.4
house 8.4
sale 8.3
shopping 8.3
art 7.9
passport 7.9
monetary 7.8
paying 7.8
states 7.7
pound 7.7
decoration 7.7
stock 7.5
page 7.4
aged 7.2
stamp 7.2
home 7.2

Google
created on 2021-12-15

Baby 80.2
Toddler 75
Baby carriage 71.6
Vintage clothing 64.5
Room 64.2
Font 63.7
Motor vehicle 62.2
Baby Products 62.2
Logo 60.4
Paper product 60.3
Advertising 60.2
Brand 59.6
History 59.5
Classic 58.3
Sitting 54.5
Visual arts 53.2
Suit 53.2
Paper 53.2
Graphics 52.4
Rectangle 52.2

Microsoft
created on 2021-12-15

text 95.8
person 90.7
human face 90.2
baby 89.1
clothing 78.8
toddler 74.6
old 52.8
clothes 20.7

Face analysis

AWS Rekognition

Age 0-3
Gender Female, 57.4%
Calm 79.3%
Happy 5.5%
Sad 5.4%
Confused 3.4%
Angry 3.2%
Fear 1.5%
Disgusted 0.9%
Surprised 0.8%

AWS Rekognition

Age 30-46
Gender Female, 63.5%
Calm 75.9%
Surprised 10.1%
Confused 4.5%
Angry 3.2%
Fear 1.9%
Disgusted 1.6%
Sad 1.6%
Happy 1.3%

AWS Rekognition

Age 20-32
Gender Female, 70.4%
Calm 92.2%
Surprised 4.4%
Angry 1.7%
Sad 0.5%
Fear 0.5%
Happy 0.3%
Disgusted 0.2%
Confused 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 66.9%

Captions

Microsoft

a vintage photo of a suitcase 38.5%
a vintage photo of some clothes 38.4%
a vintage photo of a person 38.3%

Text analysis

Amazon

ME?
WANT ME?
WANT
YOU
Do YOU
Do

Google

DO
You
HE?
DO You WANT HE?
WANT