Human Generated Data

Title

Untitled (woman in riding helmet with cup of tea, Pennsylvania)

Date

1940

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11691

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.2
Human 99.2
Clothing 98.3
Apparel 98.3
Hat 97.9
Hardhat 90.1
Helmet 90.1
Pottery 87.7
Saucer 67.9
Cup 66.2
Meal 60.7
Food 60.7
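
The Amazon scores above are confidence values of the kind returned by Rekognition's DetectLabels operation. A rough sketch only, not the pipeline used for this record; the file name photo.jpg and the 60 percent threshold are illustrative assumptions:

```python
# Sketch: image labels with Amazon Rekognition via boto3.
# "photo.jpg" and MinConfidence=60 are assumptions for illustration.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as image_file:
    response = client.detect_labels(
        Image={"Bytes": image_file.read()},
        MinConfidence=60,  # drop low-confidence labels
    )

for label in response["Labels"]:
    # Each label carries a name and a 0-100 confidence score.
    print(f"{label['Name']} {label['Confidence']:.1f}")
```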

Clarifai
created on 2023-10-26

people 99.8
adult 99.1
man 96.6
monochrome 95.1
one 94.2
industry 94
two 91.8
group 91.5
lid 90.7
cooking 90.2
indoors 88.6
employee 86.9
chef 86.8
group together 86.6
wear 85.8
furniture 83.6
portrait 83.5
room 82.6
restaurant 82.5
artisan 80.1

Imagga
created on 2022-01-15

man 38.4
person 37.2
waiter 31.5
seller 29.3
male 29.1
people 25.1
lab coat 24.3
adult 24.2
home 23.1
dining-room attendant 22.2
worker 21.5
coat 21
employee 19.9
sitting 19.7
work 19.6
indoors 19.3
happy 18.8
senior 18.7
kitchen 18.1
food 17.5
smiling 17.4
table 16.6
30s 16.4
professional 16.2
job 15.9
restaurant 15.3
working 15
medical 15
men 14.6
business 14.6
lifestyle 14.5
plate 14.3
doctor 14.1
clothing 13
patient 13
holding 12.4
cooking 12.2
couple 12.2
office 11.3
mature 11.2
occupation 11
20s 11
drink 10.9
chef 10.7
businessman 10.6
elderly 10.5
together 10.5
computer 10.4
portrait 10.4
hospital 10.3
indoor 10
businesswoman 10
hand 9.9
cheerful 9.8
health 9.7
dinner 9.4
casual 9.3
coffee 9.3
wine 9.2
inside 9.2
white 9.2
cook 9.1
lunch 8.9
color 8.9
clinic 8.9
medicine 8.8
desk 8.7
nurse 8.7
mid adult 8.7
busy 8.7
talking 8.6
smile 8.6
adults 8.5
two 8.5
attractive 8.4
hat 8.3
meal 8.2
laptop 8.2
uniform 8.1
garment 8
to 8
interior 8
women 7.9
shop 7.9
bright 7.9
day 7.8
preparing 7.8
two people 7.8
modern 7.7
retirement 7.7
old 7.7
wok 7.7
profession 7.7
businesspeople 7.6
eating 7.6
focus 7.4
confident 7.3
student 7.2
suit 7.2
handsome 7.1
face 7.1
happiness 7.1
course 7
glass 7

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 97.2
person 95.8
black and white 94.6
fashion accessory 79.8
clothing 78.2
hat 74.5
monochrome 69.8
old 46.8

Color Analysis

Face analysis

AWS Rekognition

Age 48-54
Gender Female, 79.4%
Calm 98.6%
Sad 1.3%
Angry 0%
Confused 0%
Surprised 0%
Disgusted 0%
Fear 0%
Happy 0%
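
The age range, gender, and emotion percentages above match the shape of Rekognition's DetectFaces response. A minimal sketch under the same assumption of a local file name used only for illustration:

```python
# Sketch: face attributes (age range, gender, emotions) with Amazon Rekognition.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as image_file:  # illustrative file name
    response = client.detect_faces(
        Image={"Bytes": image_file.read()},
        Attributes=["ALL"],  # request age, gender, and emotion estimates
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are returned unordered; sort by confidence for display.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```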

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely
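
Google Vision reports likelihood buckets (Very unlikely through Very likely) rather than percentages. A hedged sketch using the google-cloud-vision client; the file name is again an assumption:

```python
# Sketch: face likelihood buckets with the Google Cloud Vision API.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as image_file:  # illustrative file name
    image = vision.Image(content=image_file.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each field is a Likelihood enum: VERY_UNLIKELY ... VERY_LIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```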

Feature analysis

Amazon

Person 99.2%
Hat 97.9%

Text analysis

Amazon

115
11525
115 25.
25.
11525..
11525 --KODA
: 50
--KODA
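
The detected strings above are OCR output. Rekognition's DetectText returns both LINE and WORD entries, which may explain the overlapping fragments such as "115", "25.", and "11525". A sketch of the call, with an assumed file name:

```python
# Sketch: OCR with Amazon Rekognition DetectText.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as image_file:  # illustrative file name
    response = client.detect_text(Image={"Bytes": image_file.read()})

for detection in response["TextDetections"]:
    # LINE entries group the WORD entries; both appear in the response.
    print(detection["Type"], detection["DetectedText"], f"{detection['Confidence']:.1f}")
```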

Google

11525. Doa Wu HCRESA
11525.
Wu
Doa
HCRESA