Human Generated Data

Title

Untitled (three actors in living room performing play with four ministers)

Date

1953

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8756

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.6
Human 99.6
Person 99.4
Person 99.3
Clothing 88.1
Apparel 88.1
Bench 78.6
Furniture 78.6
Suit 73.2
Overcoat 73.2
Coat 73.2
People 65.3
Photography 60.5
Photo 60.5
Crowd 59.6
Face 58.7
Priest 58.5
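
The Amazon tags above are label-detection output, each with a confidence score. A minimal sketch of how such labels can be requested from Amazon Rekognition, assuming boto3, configured AWS credentials, and a local copy of the photograph (the filename steinmetz_8756.jpg is hypothetical):

    # Minimal sketch: image labels from Amazon Rekognition (DetectLabels).
    # Assumes boto3, AWS credentials, and a local image; filename is hypothetical.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_8756.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,
        MinConfidence=55,
    )

    # Each label carries a name and a confidence score, as in the list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")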

Clarifai
created on 2023-10-25

people 100
adult 98.6
furniture 98.2
group 97.8
man 96.6
group together 96
several 94.3
leader 93.5
chair 92.3
many 92.3
two 91.3
administration 91
three 89.4
room 88.3
woman 87.7
wear 87.2
music 85.8
outfit 85.8
monochrome 82.9
musician 81.5

Imagga
created on 2022-01-09

man 31.6
male 29.8
people 27.9
person 27.3
businessman 22
room 21.8
blackboard 21.3
men 20.6
barbershop 19.7
business 19.4
classroom 17.6
adult 17.1
shop 17.1
education 15.6
black 15
teacher 13.2
office 13.1
mercantile establishment 12.9
group 12.9
hand 12.2
human 12
school 11.8
class 11.6
casual 11
diagram 10.5
sign 10.5
success 10.5
old 10.4
looking 10.4
women 10.3
work 10.2
job 9.7
portrait 9.7
chart 9.6
student 9.3
board 9.2
computer 8.9
place of business 8.8
drawing 8.6
smile 8.5
silhouette 8.3
equipment 8.2
idea 8
building 8
design 7.9
face 7.8
teaching 7.8
patient 7.8
corporate 7.7
two 7.6
holding 7.4
symbol 7.4
indoor 7.3
suit 7.2
team 7.2
travel 7
professional 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 99.6
clothing 95.4
person 95.4
man 84.1
black and white 78.4
white 75.1

Color Analysis

Face analysis

AWS Rekognition

Age 20-28
Gender Male, 85.7%
Calm 99.7%
Happy 0.1%
Sad 0.1%
Angry 0%
Confused 0%
Disgusted 0%
Fear 0%
Surprised 0%

AWS Rekognition

Age 24-34
Gender Male, 61.4%
Calm 99.9%
Sad 0%
Happy 0%
Surprised 0%
Angry 0%
Disgusted 0%
Confused 0%
Fear 0%

AWS Rekognition

Age 41-49
Gender Male, 99.9%
Calm 98.7%
Sad 1.1%
Confused 0.1%
Surprised 0%
Happy 0%
Angry 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 30-40
Gender Female, 69.5%
Happy 96.4%
Calm 1.6%
Sad 0.5%
Fear 0.4%
Surprised 0.3%
Angry 0.3%
Disgusted 0.2%
Confused 0.2%
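
The four AWS Rekognition entries above report an estimated age range, a gender guess, and per-emotion scores for each detected face. A minimal sketch of the corresponding call, assuming boto3, AWS credentials, and the same hypothetical local filename:

    # Minimal sketch: per-face age, gender, and emotion estimates from
    # Amazon Rekognition (DetectFaces). Filename is hypothetical.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_8756.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        top = max(face["Emotions"], key=lambda e: e["Confidence"])
        print(f"Age {age['Low']}-{age['High']}, "
              f"Gender {gender['Value']} {gender['Confidence']:.1f}%, "
              f"{top['Type'].title()} {top['Confidence']:.1f}%")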

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
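
The two Google Vision entries above report likelihood buckets (Very unlikely, Unlikely, and so on) rather than percentages. A minimal sketch of that call, assuming the google-cloud-vision client library, application credentials, and the same hypothetical filename:

    # Minimal sketch: face likelihoods from the Google Cloud Vision API.
    # Assumes google-cloud-vision and credentials; filename is hypothetical.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("steinmetz_8756.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Likelihood fields come back as enum buckets such as VERY_UNLIKELY,
    # matching the "Very unlikely" / "Unlikely" values listed above.
    for face in response.face_annotations:
        print("Joy:", vision.Likelihood(face.joy_likelihood).name)
        print("Sorrow:", vision.Likelihood(face.sorrow_likelihood).name)
        print("Anger:", vision.Likelihood(face.anger_likelihood).name)
        print("Surprise:", vision.Likelihood(face.surprise_likelihood).name)
        print("Headwear:", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred:", vision.Likelihood(face.blurred_likelihood).name)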

Feature analysis

Amazon

Person 99.6%
Bench 78.6%

Categories

Text analysis

Amazon

38
38619-A
KODA
38 619-A
619-A
٢8
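
The strings above are Amazon Rekognition text detections, apparently the film-edge markings on the negative. A minimal sketch of the corresponding call, assuming boto3, AWS credentials, and the same hypothetical filename:

    # Minimal sketch: detected text from Amazon Rekognition (DetectText).
    # Filename is hypothetical.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_8756.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    # Detections come back as LINE and WORD entries with the recognized
    # string, e.g. "38619-A" or "KODA" as listed above.
    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"])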

Google

38619- A. 58 YT37A°2-AG 38619-A.
38619-
A.
58
YT37A°2-AG
38619-A.