Human Generated Data

Title

Untitled (women at Christmas fair booth)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4485

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.8
Human 99.8
Person 99.7
Person 99.5
Person 99.2
Person 94.6
Person 84.9
Clothing 77.8
Apparel 77.8
Text 77.6
People 72
Leisure Activities 63.6
Meal 63
Food 63
Bazaar 56.7
Shop 56.7
Market 56.7
Girl 56.7
Female 56.7
Crowd 56.1
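
The label/confidence pairs above are the kind of output returned by a label-detection call such as AWS Rekognition's DetectLabels. A minimal sketch of how comparable tags could be requested with boto3 is shown below; the filename image.jpg and the configured AWS credentials are assumptions for illustration, not part of this record.

    import boto3

    # Hypothetical local copy of the photograph; any JPEG/PNG bytes work.
    with open("image.jpg", "rb") as f:
        image_bytes = f.read()

    rekognition = boto3.client("rekognition")

    # Ask for up to 20 labels with at least 50% confidence, roughly matching
    # the range of scores listed above.
    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,
        MinConfidence=50,
    )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')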

Clarifai
created on 2023-10-26

people 99.5
man 97.1
group 97.1
adult 96.6
woman 96.2
wear 89.6
music 89
child 87.3
monochrome 86.9
sit 85.6
group together 85
retro 84.2
musician 83.1
adolescent 82
boy 78.9
sitting 78.7
nostalgia 78.1
war 77.2
many 75.4
furniture 72.2

Imagga
created on 2022-01-23

people 21.7
silhouette 19.9
male 19.9
business 19.4
newspaper 19
men 18
person 17.8
product 17
man 16.8
slick 15.9
art 15.1
creation 14.9
retro 14.7
symbol 14.1
drawing 14.1
design 14.1
black 13.8
grunge 13.6
businessman 13.2
paper 12.5
old 12.5
vintage 12.4
banking 11.9
finance 11.8
group 11.3
graphic 10.9
bank 10.9
financial 10.7
women 10.3
youth 10.2
dollar 10.2
money 10.2
daily 10.1
sport 10
team 9.9
sign 9.8
success 9.6
poster 9.4
player 9.2
cash 9.1
idea 8.9
us 8.7
education 8.7
one 8.2
style 8.2
currency 8.1
blackboard 7.9
student 7.7
chart 7.6
two 7.6
hand 7.6
power 7.6
happy 7.5
human 7.5
dance 7.5
letter 7.4
sketch 7.4
building 7.3
work 7.2
music 7.2
game 7.2

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 99.9
drawing 95.6
cartoon 94.9
person 93.1
clothing 88.4
posing 79.7
man 77.8
sketch 67.3
black and white 55.8
old 51.9

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 30-40
Gender Male, 99.8%
Happy 78.4%
Calm 11%
Surprised 4.4%
Sad 2%
Confused 1.6%
Disgusted 1.2%
Angry 1%
Fear 0.4%

AWS Rekognition

Age 33-41
Gender Female, 82.1%
Calm 88.6%
Sad 5.8%
Happy 2.3%
Angry 1.3%
Surprised 1.1%
Confused 0.5%
Disgusted 0.3%
Fear 0.1%

AWS Rekognition

Age 35-43
Gender Male, 98.7%
Sad 74.7%
Surprised 12.8%
Confused 4%
Calm 2.1%
Disgusted 1.9%
Fear 1.5%
Happy 1.5%
Angry 1.4%

AWS Rekognition

Age 38-46
Gender Male, 99.9%
Happy 92.2%
Disgusted 5.1%
Calm 0.8%
Confused 0.8%
Surprised 0.5%
Angry 0.2%
Sad 0.2%
Fear 0.1%
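
The age ranges, gender estimates, and per-emotion scores in the blocks above match the shape of an AWS Rekognition DetectFaces response when all facial attributes are requested. The sketch below shows one way such values could be read; the local filename and client setup are assumptions.

    import boto3

    rekognition = boto3.client("rekognition")

    with open("image.jpg", "rb") as f:  # hypothetical local copy of the photograph
        image_bytes = f.read()

    # Attributes=["ALL"] returns age range, gender, and per-emotion confidences
    # for each detected face, as reported in the blocks above.
    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')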

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
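
Unlike the numeric scores above, the Google Vision rows report bucketed likelihoods (Very unlikely through Very likely). A minimal sketch of reading those buckets with the google-cloud-vision client follows; the file path and authenticated client are assumptions for illustration.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("image.jpg", "rb") as f:  # hypothetical local copy of the photograph
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Each attribute is a Likelihood enum value, which the record above renders
    # as "Very unlikely", "Unlikely", and so on.
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)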

Feature analysis

Amazon

Person 99.8%

Categories

Imagga

paintings art 99.4%

Text analysis

Amazon

TAKEN
MAGAZINE
DAY
TMAS DAY
THE
TMAS
E
TAKEN HE
SUBSCRIPTION
21434.
HE
В
3
|
3 I
SEKODIN | NAMTCA
SEKODIN
send
NAMTCA
I
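
The strings above, including the fragmentary and reversed pieces, are the kind of raw output a text-detection call such as AWS Rekognition's DetectText returns for signage photographed at an angle. A minimal sketch, again assuming boto3 credentials and a local copy of the image:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("image.jpg", "rb") as f:  # hypothetical local copy of the photograph
        image_bytes = f.read()

    # DetectText returns both LINE and WORD detections; the list above mixes the two.
    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"], f'{detection["Confidence"]:.1f}')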

Google

TMA DAY MAGAZINE SUBSCRIPTION TAKEN HEr- 1434. -ИАМ
TMA
DAY
MAGAZINE
SUBSCRIPTION
TAKEN
HEr-
1434.
-ИАМ