Human Generated Data

Title

Untitled (couple seated at table with candlesticks)

Date

1941

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8575

Copyright

© Estate of Joseph Janney Steinmetz
Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.8
Human 99.8
Person 99.6
Person 99.2
Person 98.3
Hat 91
Clothing 91
Apparel 91
Sitting 84.7
Face 68.4
Meal 67.5
Food 67.5
Suit 62.2
Coat 62.2
Overcoat 62.2
Finger 60.9
Drawing 57.4
Art 57.4
Shirt 56.1

Clarifai
created on 2023-10-25

people 99.9
monochrome 99.1
group 97.6
woman 96.9
street 96.7
man 96
adult 95.4
music 95
administration 92.6
group together 92.6
two 91
child 89.9
three 88.5
musician 87
guitar 85.4
four 82.1
singer 81.5
wear 80.3
black and white 78.6
portrait 78.3

Imagga
created on 2022-01-09

barbershop 35.3
shop 29.8
man 28.2
mercantile establishment 22.8
male 22
people 21.7
person 16.2
black 15.8
place of business 15.2
business 14.6
city 14.1
silhouette 13.2
adult 13
men 12.9
old 12.5
street 12
indoors 11.4
hairdresser 11.1
portrait 10.3
world 10.2
hand 9.9
clothing 9.8
human 9.7
building 9.7
life 9.7
urban 9.6
office 9.5
lifestyle 9.4
fashion 9
window 8.8
businessman 8.8
love 8.7
inside 8.3
establishment 8.2
door 8
women 7.9
smile 7.8
boy 7.8
architecture 7.8
scene 7.8
travel 7.7
casual 7.6
suit 7.4
hat 7.4
light 7.3
alone 7.3
group 7.2
garment 7.1
face 7.1
job 7.1
working 7.1
work 7.1
modern 7

Google
created on 2022-01-09

Black-and-white 86.3
Coat 85.5
Style 83.9
Hat 81.6
Adaptation 79.3
Suit 78.3
Monochrome 77.5
Monochrome photography 75.9
Font 75.6
Eyewear 73.1
Vintage clothing 70.5
Event 70.3
Room 70.1
Art 67.1
History 63.1
Sitting 62.5
Visual arts 57.7
Classic 57.5
Photo caption 57.3

Microsoft
created on 2022-01-09

person 99.3
text 98.7
black and white 93.1
clothing 91.3
outdoor 89.8
man 61.3
human face 61
drawing 57

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 39-47
Gender Male, 99.3%
Calm 91.7%
Surprised 3.3%
Confused 1.4%
Disgusted 1.2%
Sad 1.2%
Angry 0.5%
Fear 0.4%
Happy 0.3%

AWS Rekognition

Age 18-26
Gender Female, 99.2%
Sad 53.5%
Happy 35.5%
Surprised 5.1%
Angry 2.1%
Calm 1.4%
Confused 1.1%
Disgusted 0.7%
Fear 0.6%

AWS Rekognition

Age 34-42
Gender Male, 93.2%
Calm 99.7%
Angry 0.1%
Confused 0.1%
Sad 0.1%
Happy 0%
Surprised 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 33-41
Gender Female, 78.1%
Calm 52.4%
Sad 45.3%
Angry 0.6%
Disgusted 0.6%
Fear 0.4%
Happy 0.3%
Confused 0.3%
Surprised 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Hat 91%

Categories

Text analysis

Amazon

17780.
a

Google

ררו
. 8oררן ם% ררו
.
8oררן
ם%