Human Generated Data

Title

Untitled (young children in a room with toys)

Date

1940

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8402

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Human 99.6
Person 99.6
Person 99.5
Person 98.2
Person 97.9
Person 96.5
Person 96.3
Interior Design 96.1
Indoors 96.1
Person 94
Person 92.9
Room 87.5
Person 83.6
Furniture 83.4
Person 81.8
Crowd 76.9
Person 74.3
Mammal 72.2
Pet 72.2
Dog 72.2
Canine 72.2
Animal 72.2
Person 72
People 68.7
Person 68.2
Person 66.1
Photo 65.5
Photography 65.5
Face 65.4
Portrait 63.6
Female 62.2
Girl 62.2
Person 62.2
Classroom 59.2
School 59.2
Audience 57.9
Paper 57.6
Text 56.1
Living Room 55.9
Poster 55.7
Advertisement 55.7
Bedroom 55.2

Imagga
created on 2022-01-09

shop 34.6
classroom 32.3
people 31.2
man 30.9
room 26.9
shoe shop 26.4
person 25.1
mercantile establishment 23.9
male 23.4
men 21.4
business 21.2
blackboard 19.6
adult 19.2
group 16.1
teacher 15.8
place of business 15.8
city 14.9
interior 14.1
businessman 14.1
equipment 14
urban 14
hockey stick 13.6
board 12.8
human 12.7
education 12.1
office 11.7
team 11.6
chair 11.3
lifestyle 10.8
stick 10.8
indoors 10.5
modern 10.5
boy 10.4
school 10.3
floor 10.2
black 10.2
indoor 10
teaching 9.7
success 9.6
women 9.5
meeting 9.4
sports equipment 9.4
house 9.2
studio 9.1
girls 9.1
silhouette 9.1
music 9
worker 8.9
musician 8.9
job 8.8
building 8.8
standing 8.7
class 8.7
work 8.6
student 8.6
guitar 8.6
active 8.4
center 8.3
window 8.2
happy 8.1
supermarket 8.1
handsome 8
working 7.9
students 7.8
sport 7.7
wall 7.7
crowd 7.7
casual 7.6
walk 7.6
finance 7.6
establishment 7.5
holding 7.4
style 7.4
shopping 7.3
professional 7.3
home 7.2
transportation 7.2
portrait 7.1
to 7.1
travel 7
child 7

Google
created on 2022-01-09

Photograph 94.2
Window 93.8
Building 90.4
Black 89.5
Black-and-white 84.7
Style 83.9
Art 81.7
Adaptation 79.3
Snapshot 74.3
Urban design 74.3
Chair 73.5
Musician 73
Font 72.9
Monochrome photography 72.1
Monochrome 71.6
Event 68.6
Plant 67.6
Room 67.2
Music 67.2
Visual arts 67

Microsoft
created on 2022-01-09

text 99.9
person 81.4
black and white 52.5

Face analysis

AWS Rekognition

Age 47-53
Gender Male, 86.4%
Calm 81.3%
Happy 11.3%
Confused 2.8%
Sad 1.8%
Angry 1%
Surprised 0.8%
Disgusted 0.7%
Fear 0.3%

AWS Rekognition

Age 37-45
Gender Male, 98.7%
Sad 70.9%
Calm 21%
Confused 3.6%
Disgusted 1.8%
Happy 1%
Angry 1%
Surprised 0.4%
Fear 0.3%

AWS Rekognition

Age 29-39
Gender Male, 78.1%
Sad 90.7%
Calm 7.4%
Happy 0.7%
Confused 0.4%
Surprised 0.3%
Angry 0.2%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 18-24
Gender Male, 70.3%
Calm 66.9%
Fear 24.3%
Surprised 4.3%
Sad 1.6%
Happy 1.4%
Angry 0.9%
Confused 0.4%
Disgusted 0.3%

AWS Rekognition

Age 30-40
Gender Female, 65.1%
Confused 45.1%
Sad 22.7%
Disgusted 14.5%
Calm 10.7%
Angry 2.5%
Happy 2.4%
Surprised 1.3%
Fear 0.9%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Dog 72.2%

Captions

Microsoft

graphical user interface 19.2%

Text analysis

Amazon

12066
12066.
ИА
TERAS ИА
TERAS

Google

12066
12066 12066 12066.
12066.