Human Generated Data

Title

Untitled (couples dancing in room with moosehead)

Date

c. 1945

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7653

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Human 99.6
Person 99.6
Person 99.5
Person 99.3
Person 97
Person 96
Person 95.8
Interior Design 94.5
Indoors 94.5
Clothing 93.7
Apparel 93.7
Person 91.2
Person 90.6
Person 87
Person 85.1
Person 82
Person 79.1
Face 77.6
Text 72.8
People 72.5
Person 72.1
Person 72
Person 70.3
Person 67.5
Photography 65.5
Photo 65.5
Crowd 65.5
Person 63.9
Person 63.9
Person 63.8
Person 63.3
Portrait 62.7
Person 61
Meal 58.7
Food 58.7
Overcoat 57.9
Suit 57.9
Coat 57.9
Head 57.3
Poster 56.6
Advertisement 56.6
Female 56.4
Room 55.5
Person 54.2
Person 44.5
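
The Amazon tags above have the shape of output from the Rekognition label-detection API. A minimal sketch of generating comparable tags with boto3 follows; the S3 bucket name and object key are placeholder assumptions, not the museum's actual storage.

```python
# Minimal sketch: label detection with Amazon Rekognition via boto3.
# Bucket name and object key are hypothetical placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-7653.jpg"}},
    MaxLabels=50,
    MinConfidence=40,
)

# Each label carries a name and a 0-100 confidence score,
# matching entries such as "Person 99.6" in the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```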

Clarifai
created on 2023-10-25

people 99.7
group 99.2
woman 98
many 96.8
man 95.5
group together 94.4
adult 92.5
dancing 91.2
indoors 90.8
leader 89.9
music 87.6
chair 87.6
several 83.1
education 82.9
furniture 81.5
administration 81.3
monochrome 80.1
ceremony 78.7
room 78
wedding 77.7
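
The Clarifai tags follow the same pattern. A rough sketch of querying Clarifai's v2 prediction endpoint over REST is shown below, assuming the general image-recognition model; the API key, model ID, and image URL are placeholders.

```python
# Rough sketch: requesting concept tags from Clarifai's v2 REST API.
# API key, model identifier, and image URL are placeholder assumptions.
import requests

CLARIFAI_API_KEY = "YOUR_API_KEY"          # placeholder
MODEL_ID = "general-image-recognition"     # assumed general model

payload = {
    "inputs": [
        {"data": {"image": {"url": "https://example.org/steinmetz-7653.jpg"}}}
    ]
}

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json=payload,
    timeout=30,
)
resp.raise_for_status()

# Concepts return a name and a 0-1 confidence value; scaled by 100 they
# are comparable to the "people 99.7" style entries above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```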

Imagga
created on 2022-01-08

teacher 78.4
educator 48.8
professional 48.3
person 45.5
man 42.3
adult 41.5
brass 40.9
male 39
cornet 38.9
businessman 38.8
people 36.8
business 35.8
wind instrument 33.9
office 25.7
team 25.1
group 25
education 23.4
musical instrument 22.3
student 21.7
meeting 21.7
job 20.3
businesswoman 20
corporate 19.8
executive 19.5
businesspeople 19
men 18.9
work 18.8
teamwork 18.5
indoor 18.3
room 17.9
classroom 17.5
blackboard 17.3
modern 16.8
women 15.8
board 15.4
school 15.3
communication 15.1
table 14.7
teaching 14.6
class 14.5
portrait 14.2
presentation 14
entrepreneur 13.8
occupation 13.7
handsome 13.4
sitting 12.9
leader 12.5
employee 12.4
boss 12.4
looking 12
worker 11.6
crowd 11.5
hand 11.4
chair 11.4
desk 11.3
couple 11.3
happy 11.3
study 11.2
successful 11
audience 10.7
manager 10.2
black 10.2
casual 10.2
laptop 10
silhouette 9.9
life 9.7
computer 9.6
studying 9.6
design 9.6
hands 9.6
career 9.5
smiling 9.4
bright 9.3
confident 9.1
suit 9
human 9
color 8.9
success 8.8
working 8.8
math 8.8
cheering 8.8
nighttime 8.8
speech 8.8
indoors 8.8
stadium 8.8
vibrant 8.8
symbol 8.7
boy 8.7
lifestyle 8.7
patriotic 8.6
formal 8.6
nation 8.5
youth 8.5
mature 8.4
lights 8.3
training 8.3
flag 8.3
oboe 7.9
chalkboard 7.8
mathematics 7.8
standing 7.8
conference 7.8
students 7.8
businessperson 7.8
partners 7.8
hall 7.7
attractive 7.7
exam 7.7
talking 7.6
finance 7.6
college 7.6
smart 7.5
holding 7.4
phone 7.4
clothing 7.4
smile 7.1
icon 7.1

Google
created on 2022-01-08

Photograph 94.1
Black 89.7
Coat 89.3
Picture frame 87.6
Black-and-white 87.2
Style 84.1
Art 82.9
Monochrome photography 78.2
Monochrome 77.8
Suit 76.5
Snapshot 74.3
Event 73.6
Room 71.5
Formal wear 66.2
Font 66
Stock photography 65.3
Vintage clothing 64.7
Visual arts 64.7
Chair 62
Hat 61.3
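
The Google tags correspond to Cloud Vision label detection. A minimal sketch with the google-cloud-vision client follows; the image URL is a placeholder, and credentials are assumed to be configured in the environment.

```python
# Minimal sketch: label detection with Google Cloud Vision.
# The image URI is a placeholder; auth comes from application default credentials.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

image = vision.Image()
image.source.image_uri = "https://example.org/steinmetz-7653.jpg"

response = client.label_detection(image=image)

# Scores are 0-1; multiplied by 100 they line up with entries
# such as "Photograph 94.1" above.
for annotation in response.label_annotations:
    print(f"{annotation.description} {annotation.score * 100:.1f}")
```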

Microsoft
created on 2022-01-08

text 99.1
person 96.6
indoor 92.8
clothing 88.5
wedding dress 85.7
flower 83.4
woman 78.1
bride 64.9
man 52.6
vase 51.1

Face analysis

AWS Rekognition

Age 27-37
Gender Male, 52%
Calm 38.7%
Surprised 32.3%
Sad 10.7%
Happy 10.5%
Confused 3%
Angry 2.6%
Disgusted 1.4%
Fear 0.8%

AWS Rekognition

Age 16-24
Gender Female, 84.4%
Calm 98.4%
Sad 1.3%
Happy 0.1%
Angry 0.1%
Confused 0%
Disgusted 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 20-28
Gender Female, 95.2%
Calm 65.5%
Fear 25%
Happy 5%
Surprised 1.9%
Disgusted 0.9%
Sad 0.8%
Angry 0.7%
Confused 0.1%

AWS Rekognition

Age 42-50
Gender Female, 72.9%
Calm 97.7%
Sad 1.5%
Disgusted 0.2%
Confused 0.2%
Angry 0.1%
Happy 0.1%
Surprised 0.1%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
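
The face analysis blocks above are per-face attribute estimates. A minimal sketch of producing the AWS Rekognition portion with boto3 is shown below (bucket and key are placeholders); the Google Vision likelihood rows would come from its separate face-detection call.

```python
# Minimal sketch: face attribute analysis with Amazon Rekognition.
# Bucket name and object key are hypothetical placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-7653.jpg"}},
    Attributes=["ALL"],
)

# Each detected face yields an age range, a gender guess with confidence,
# and ranked emotions -- the shape of the AWS Rekognition blocks above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```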

Feature analysis

Amazon

Person 99.6%

Text analysis

Amazon

19779
17179.

Google

17179
מ*
כ-ךרב
17179 מ* כ-ךרב
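
The Text analysis entries are raw OCR detections of the numbers and marks on the print. A minimal sketch of the Amazon side with Rekognition text detection follows (storage names are placeholders); the Google column would come from Cloud Vision's text-detection feature.

```python
# Minimal sketch: text (OCR) detection with Amazon Rekognition.
# Bucket name and object key are hypothetical placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-7653.jpg"}}
)

# Line-level detections correspond to entries such as "19779" and "17179."
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```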