Human Generated Data

Title

Untitled (artist sketching woman on street)

Date

1940s

People

Artist: Mary Lowber Tiers, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1352

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Sitting 99.8
Human 99.8
Person 99.3
Person 98.8
Clothing 97
Apparel 97
Art 91.1
Furniture 90.1
Drawing 84.9
Sketch 82.2
Chair 81.5
Person 79.9
Flooring 63.9
Crowd 60.7
Shoe 59.3
Footwear 59.3
Canvas 57.7
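
The Amazon list above pairs each detected label with a Rekognition confidence score. As a minimal sketch of how such output could be reproduced with the boto3 SDK (configured AWS credentials and the file name "photo.jpg" are assumptions):

```python
# Minimal sketch: label detection with Amazon Rekognition via boto3.
# Assumes AWS credentials are configured; "photo.jpg" is a placeholder.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=50,  # drop labels scored below 50%
    )

# Print "Name Confidence" pairs in the same shape as the list above.
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))
```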

Imagga
created on 2022-01-22

hairdresser 54.1
man 41
people 33.5
male 33.3
person 31.7
adult 24.6
sitting 22.3
barbershop 22
indoors 22
men 20.6
room 20.5
senior 19.7
office 18.5
happy 18.2
professional 18.1
smiling 18.1
shop 17.9
medical 17.7
business 17.6
businessman 16.8
casual 16.1
hospital 16
doctor 16
home 16
patient 15.9
mature 15.8
smile 15.7
indoor 15.5
computer 15.2
lifestyle 15.2
worker 14.1
table 13.8
looking 13.6
mercantile establishment 13.4
family 13.3
two 12.7
work 12.6
job 12.4
talking 12.4
businesspeople 12.3
desk 12.3
couple 12.2
scholar 12
women 11.9
businesswoman 11.8
day 11.8
working 11.5
elderly 11.5
together 11.4
health 11.1
salon 11
portrait 11
occupation 11
clinic 11
happiness 11
colleagues 10.7
seller 10.5
color 10
holding 9.9
painter 9.8
50s 9.8
classroom 9.8
30s 9.6
intellectual 9.6
standing 9.6
meeting 9.4
nurse 9.4
20s 9.2
attractive 9.1
old 9.1
aged 9.1
place of business 9
group 8.9
to 8.8
medicine 8.8
chair 8.8
40s 8.8
mid adult 8.7
education 8.7
retirement 8.6
profession 8.6
daughter 8.6
corporate 8.6
bright 8.6
horizontal 8.4
care 8.2
coat 8.2
mother 7.9
black 7.8
face 7.8
two people 7.8
retired 7.8
busy 7.7
illness 7.6
camera 7.4
cheerful 7.3
school 7.2
interior 7.1
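
The Imagga tags above follow the same tag-plus-confidence shape. A minimal sketch against Imagga's v2 REST tagging endpoint, assuming the requests library and placeholder credentials and file name:

```python
# Minimal sketch: tagging via the Imagga v2 REST API.
# API_KEY, API_SECRET, and "photo.jpg" are placeholders.
import requests

API_KEY = "your_api_key"
API_SECRET = "your_api_secret"

with open("photo.jpg", "rb") as f:
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),  # HTTP basic auth
        files={"image": f},
    )

# Each result entry carries an English tag name and a confidence score.
for entry in response.json()["result"]["tags"]:
    print(entry["tag"]["en"], entry["confidence"])
```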

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

person 99.8
text 97.3
clothing 94.8
sitting 94.3
furniture 85
footwear 80.9
black and white 78.5
drawing 76.1
woman 63.8
handwriting 52.6
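
The Microsoft tags above can be produced with the Azure Computer Vision service. A minimal sketch using the azure-cognitiveservices-vision-computervision SDK, with placeholder endpoint, key, and file name:

```python
# Minimal sketch: image tagging with Azure Computer Vision.
# ENDPOINT, KEY, and "photo.jpg" are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"
KEY = "your_subscription_key"

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

with open("photo.jpg", "rb") as f:
    result = client.tag_image_in_stream(f)

# Confidences come back in [0, 1]; scale to match the percentages above.
for tag in result.tags:
    print(tag.name, round(tag.confidence * 100, 1))
```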

Face analysis

AWS Rekognition

Age 18-26
Gender Female, 99%
Calm 91%
Sad 5.7%
Angry 1.8%
Confused 0.7%
Disgusted 0.3%
Surprised 0.2%
Happy 0.2%
Fear 0.2%

AWS Rekognition

Age 31-41
Gender Female, 99.9%
Calm 98.4%
Sad 0.6%
Happy 0.5%
Disgusted 0.2%
Angry 0.1%
Fear 0.1%
Confused 0.1%
Surprised 0.1%

AWS Rekognition

Age 24-34
Gender Female, 56.6%
Calm 95.3%
Fear 2.9%
Sad 0.7%
Surprised 0.4%
Angry 0.3%
Confused 0.2%
Disgusted 0.1%
Happy 0.1%

AWS Rekognition

Age 23-33
Gender Female, 95.3%
Calm 58.3%
Angry 17%
Disgusted 15.5%
Sad 2.3%
Fear 2.2%
Surprised 2.1%
Confused 1.5%
Happy 0.9%
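
Each AWS Rekognition block above reports an age range, a gender estimate, and an emotion distribution for one detected face. A minimal sketch of the corresponding DetectFaces call via boto3 (credentials and file name assumed as before):

```python
# Minimal sketch: face attributes with Rekognition DetectFaces via boto3.
# Assumes AWS credentials are configured; "photo.jpg" is a placeholder.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Emotions arrive unsorted; sort descending to mirror the blocks above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```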

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
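
The four Google Vision blocks above (one per detected face) report Likelihood buckets rather than percentages. A minimal sketch with the google-cloud-vision client library, assuming application-default credentials and a placeholder file name:

```python
# Minimal sketch: face likelihoods with the Google Cloud Vision client.
# Assumes GOOGLE_APPLICATION_CREDENTIALS is set; "photo.jpg" is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each attribute is a Likelihood enum (e.g. VERY_UNLIKELY), as listed above.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```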

Feature analysis

Amazon

Person 99.3%
Chair 81.5%
Shoe 59.3%

Captions

Microsoft

a man and a woman sitting on a bench 71.5%
a person sitting on a bench 71.4%
a man and woman sitting on a bench 67.1%
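
The ranked Microsoft captions above correspond to Azure Computer Vision's image description feature, which returns several candidate sentences with confidences. A minimal sketch, reusing the placeholder endpoint and key from the tagging sketch:

```python
# Minimal sketch: caption candidates with Azure Computer Vision.
# Endpoint, key, and "photo.jpg" are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("your_subscription_key"),
)

with open("photo.jpg", "rb") as f:
    description = client.describe_image_in_stream(f, max_candidates=3)

# Confidences come back in [0, 1]; scale to match the percentages above.
for caption in description.captions:
    print(caption.text, round(caption.confidence * 100, 1))
```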

Text analysis

Amazon

CITRUS
CITRUS FRUIT
FRUIT
INDIAN
INDIAN RIVE
RIVE
THE
any
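
The Amazon text results above mix whole lines with their component words, which is why "CITRUS FRUIT" appears alongside "CITRUS" and "FRUIT". A minimal sketch of the Rekognition DetectText call via boto3 (credentials and file name assumed as before):

```python
# Minimal sketch: text detection with Rekognition DetectText via boto3.
# Assumes AWS credentials are configured; "photo.jpg" is a placeholder.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

# Rekognition returns both LINE and WORD detections, so phrases appear
# alongside their individual words, as in the list above.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])
```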

Google

CITRUS
CITRUS FRUIT
FRUIT
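
The Google text results follow the same pattern. A minimal sketch with the google-cloud-vision client's text_detection method, under the same credential and file-name assumptions as the face sketch:

```python
# Minimal sketch: OCR with Google Cloud Vision text_detection.
# Assumes GOOGLE_APPLICATION_CREDENTIALS is set; "photo.jpg" is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full detected text; the rest are words.
for annotation in response.text_annotations:
    print(annotation.description)
```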