Human Generated Data

Title

Untitled (maquette of dancers for carte-de-visite)

Date

c. 1865

People

Artist: Jeremiah Gurney & Son, American, active 1840s-1890s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Fund for the Acquisition of Photographs, P1998.50

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 96.3
Human 96.3
Person 91.3
Buckle 60.9
Text 58.3
Person 43.2

Imagga
created on 2022-01-09

apron 93.8
protective garment 75.4
clothing 68.8
consumer goods 41.2
covering 40.7
person 19.8
commodity 18.8
fashion 18.1
people 17.8
business 17
adult 16.1
holding 15.7
black 14.9
attractive 14
top 13.7
portrait 12.9
lace 12.2
face 12.1
man 11.5
happy 11.3
human 11.2
male 10.6
lady 10.5
clothes 10.3
expression 10.2
paper 10.2
professional 10.1
model 10.1
suit 10
worker 9.8
one 9.7
success 9.6
office 9.6
work 9.6
brunette 9.6
smiling 9.4
money 9.3
casual 9.3
pink 9.2
pretty 9.1
dress 9
standing 8.7
jacket 8.6
gift 8.6
post 8.6
smile 8.5
card 8.5
studio 8.3
traditional 8.3
box 8.1
garment 7.9
embroidery 7.9
design 7.9
happiness 7.8
blank 7.7
modern 7.7
culture 7.7
finance 7.6
sign 7.5
note 7.3
against 7.3
friendly 7.3
present 7.3
businesswoman 7.3
currency 7.2
cute 7.2
job 7.1
businessman 7.1
art 7

Google
created on 2022-01-09

Sleeve 87.2
Ball 81.2
Art 81.2
Font 80.2
Jewellery 74.8
Circle 74.7
Pattern 74.7
Rectangle 74.3
Symmetry 72.2
T-shirt 71.1
Fashion accessory 67.6
Visual arts 63.2
Light fixture 61.8
Paper product 61.7
Triangle 61.4
Illustration 59.5
Paper 57.8
Symbol 55.3
Pattern 55.1
Craft 55

Microsoft
created on 2022-01-09

text 98.3

Face analysis

AWS Rekognition

Age 23-31
Gender Female, 100%
Calm 85.4%
Angry 3.8%
Sad 3.3%
Happy 2.2%
Surprised 1.9%
Fear 1.4%
Confused 1.3%
Disgusted 0.8%

AWS Rekognition

Age 24-34
Gender Female, 99.9%
Calm 73.1%
Fear 7.5%
Sad 7.2%
Happy 5.4%
Angry 2.8%
Confused 2.1%
Surprised 1.1%
Disgusted 0.8%

AWS Rekognition

Age 21-29
Gender Female, 100%
Sad 36.3%
Calm 26.4%
Fear 19.6%
Angry 8.5%
Disgusted 3.7%
Surprised 3.2%
Happy 1.8%
Confused 0.5%

AWS Rekognition

Age 21-29
Gender Female, 78.7%
Calm 81.5%
Sad 5.6%
Fear 4.6%
Confused 3.2%
Angry 1.8%
Surprised 1.5%
Disgusted 1.3%
Happy 0.6%

AWS Rekognition

Age 22-30
Gender Male, 96.5%
Calm 98.6%
Disgusted 0.7%
Confused 0.2%
Angry 0.1%
Happy 0.1%
Surprised 0.1%
Sad 0.1%
Fear 0.1%

AWS Rekognition

Age 23-31
Gender Female, 95.5%
Calm 96.8%
Happy 0.8%
Fear 0.6%
Confused 0.6%
Sad 0.5%
Surprised 0.4%
Disgusted 0.1%
Angry 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 96.3%

Captions

Microsoft

a close up of a logo 74.5%
a close up of a piece of paper 62.6%
close up of a logo 62.5%