Human Generated Data

Title

Untitled (couples holding drinks and socializing)

Date

c. 1965

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11303

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (confidence scores, %)

Amazon
created on 2022-01-09

Person 99.8
Human 99.8
Person 99.7
Person 99.7
Apparel 99.6
Clothing 99.6
Person 99.4
Person 99.1
Person 98.5
Person 97.7
Shorts 95.2
Person 95
Person 92.7
Sunglasses 87
Accessories 87
Accessory 87
Female 83.2
Person 81.7
Person 77.4
Person 73.2
Poster 68.2
Advertisement 68.2
Woman 66.2
Pants 66
People 64
Crowd 61.9
Dress 61.7
Sleeve 60.4
Indoors 59
Person 57.7
Girl 57.3
Paper 57.1
Skirt 56.5
Standing 55.4
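
Labels of this kind, each with a 0-100 confidence score, are what Amazon Rekognition's label-detection endpoint returns. A minimal sketch with boto3; the file name and the MinConfidence cutoff are illustrative assumptions, not taken from this record:

    import boto3

    IMAGE_PATH = "steinmetz_11303.jpg"  # hypothetical local copy of the photograph

    client = boto3.client("rekognition")
    with open(IMAGE_PATH, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55,  # assumed threshold; the lowest tag above is 55.4
        )

    # Each label carries a name and a 0-100 confidence score.
    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))

The repeated Person rows are consistent with Rekognition reporting a separate confidence for each detected instance of a label (label["Instances"]); the Feature analysis figure below (Person 99.8%) matches the top instance.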

Clarifai
created on 2023-10-25

people 99.7
group 98.9
group together 98.6
monochrome 98
man 97.6
woman 97.3
adult 95.8
five 92.2
several 92.1
many 91.1
music 90.5
dancing 89.6
actor 85
administration 84.4
four 83.7
child 83.3
wear 83
recreation 82.2
leader 79.3
three 78.6
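
Clarifai's general image-recognition model returns concepts with 0-1 confidence values. A hedged sketch of the v2 REST call; the model id, token variable, and image URL are assumptions:

    import requests

    CLARIFAI_PAT = "..."  # hypothetical personal access token
    IMAGE_URL = "https://example.org/steinmetz_11303.jpg"  # placeholder

    response = requests.post(
        "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
        headers={"Authorization": f"Key {CLARIFAI_PAT}"},
        json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    )

    # Concepts arrive with 0-1 values; scale to match the percentages above.
    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        print(concept["name"], round(concept["value"] * 100, 1))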

Imagga
created on 2022-01-09

brass 38.9
people 36.8
business 34
male 33.3
men 31.8
man 30.9
wind instrument 30.4
businessman 29.1
trombone 28.5
corporate 27.5
group 27.4
person 26.6
office 25.9
adult 24.6
musical instrument 24.5
women 23.7
meeting 22.6
professional 21.1
work 20.4
job 20.3
team 19.7
success 17.7
businesswoman 17.3
teamwork 16.7
happy 16.3
modern 15.4
suit 15.3
room 15
hall 14.9
building 14.7
teacher 14.5
manager 14
table 13.8
photographer 13.8
smiling 13.7
executive 13.4
communication 13.4
adults 13.2
together 13.1
diversity 12.5
talking 12.3
worker 12.3
lifestyle 12.3
standing 12.2
life 11.6
indoors 11.4
career 11.3
successful 11
corporation 10.6
human 10.5
ethnic 10.5
couple 10.4
businesspeople 10.4
hands 10.4
two 10.2
laptop 10
smile 10
silhouette 9.9
handsome 9.8
businessmen 9.7
interior 9.7
colleagues 9.7
boss 9.6
education 9.5
chair 9.5
occupation 9.2
confident 9.1
attractive 9.1
portrait 9.1
black 9
diverse 8.8
handshake 8.8
home 8.8
entrepreneur 8.8
40s 8.8
day 8.6
hand 8.3
20s 8.2
indoor 8.2
restaurant 8.1
educator 8
urban 7.9
happiness 7.8
discussion 7.8
daytime 7.7
pretty 7.7
partnership 7.7
casual 7.6
presentation 7.4
clothing 7.3
musician 7.2
microphone 7.1
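
Imagga's tagging endpoint produces lists like the one above, already on a 0-100 scale. A sketch under assumed credentials and image URL:

    import requests

    IMAGGA_KEY, IMAGGA_SECRET = "...", "..."  # hypothetical API credentials
    IMAGE_URL = "https://example.org/steinmetz_11303.jpg"  # placeholder

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
    )

    # Tags come back sorted by confidence, highest first.
    for tag in response.json()["result"]["tags"]:
        print(tag["tag"]["en"], tag["confidence"])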

Google
created on 2022-01-09

(no tags recorded)

Microsoft
created on 2022-01-09

person 99.5
clothing 97.9
man 92.6
standing 89.1
text 81.2
woman 71.3
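
Tags like these are what the Azure Computer Vision tagging operation returns, with confidences on a 0-1 scale. A sketch using the Python SDK; the endpoint, key, and file name are placeholders:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    ENDPOINT = "https://<resource>.cognitiveservices.azure.com/"  # placeholder
    KEY = "..."  # hypothetical subscription key

    client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))
    with open("steinmetz_11303.jpg", "rb") as f:  # assumed local copy
        result = client.tag_image_in_stream(f)

    # The SDK reports 0-1 confidences; scale to match the percentages above.
    for tag in result.tags:
        print(tag.name, round(tag.confidence * 100, 1))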

Face analysis

AWS Rekognition

Age 27-37
Gender Male, 65.1%
Calm 97.7%
Happy 1.5%
Sad 0.2%
Angry 0.2%
Surprised 0.1%
Fear 0.1%
Disgusted 0.1%
Confused 0.1%

AWS Rekognition

Age 47-53
Gender Male, 87.5%
Sad 56.1%
Calm 29.8%
Happy 8%
Confused 3.8%
Angry 1.1%
Disgusted 0.6%
Fear 0.3%
Surprised 0.3%

AWS Rekognition

Age 50-58
Gender Male, 99.6%
Happy 94.8%
Disgusted 1.7%
Calm 1.5%
Confused 0.5%
Angry 0.4%
Surprised 0.4%
Sad 0.3%
Fear 0.3%

AWS Rekognition

Age 48-56
Gender Male, 96.2%
Calm 52.3%
Happy 45.5%
Surprised 0.5%
Angry 0.4%
Disgusted 0.4%
Confused 0.3%
Sad 0.3%
Fear 0.1%

AWS Rekognition

Age 49-57
Gender Female, 89.4%
Calm 92.7%
Sad 6.8%
Confused 0.1%
Happy 0.1%
Disgusted 0.1%
Angry 0.1%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 41-49
Gender Female, 62.1%
Calm 80.7%
Sad 8.4%
Happy 3.9%
Surprised 2.9%
Angry 1.5%
Fear 1%
Disgusted 1%
Confused 0.7%

AWS Rekognition

Age 45-53
Gender Male, 66.1%
Calm 98.4%
Happy 0.9%
Sad 0.3%
Angry 0.1%
Disgusted 0.1%
Confused 0.1%
Surprised 0.1%
Fear 0%
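
Each block above corresponds to one face returned by Rekognition's face-analysis endpoint: an age range, a gender estimate with confidence, and a full set of emotion scores. A minimal boto3 sketch with an assumed file name:

    import boto3

    client = boto3.client("rekognition")
    with open("steinmetz_11303.jpg", "rb") as f:  # assumed local copy
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    # One FaceDetail per detected face: age range, gender, and emotion scores.
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")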

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible
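
Google Vision reports face attributes as likelihood buckets rather than percentages, which is why these blocks read "Very unlikely" through "Possible". A sketch with the google-cloud-vision client; the file name is assumed:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("steinmetz_11303.jpg", "rb") as f:  # assumed local copy
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Likelihood enum values (0-5) map to the buckets shown above.
    NAMES = ("Unknown", "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely")
    for face in response.face_annotations:
        print("Surprise", NAMES[face.surprise_likelihood])
        print("Anger", NAMES[face.anger_likelihood])
        print("Sorrow", NAMES[face.sorrow_likelihood])
        print("Joy", NAMES[face.joy_likelihood])
        print("Headwear", NAMES[face.headwear_likelihood])
        print("Blurred", NAMES[face.blurred_likelihood])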

Feature analysis

Amazon

Person 99.8%

Text analysis

Amazon

RAGON
87517
YT33A'2
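
The strings above are raw line detections from Rekognition's text-in-image endpoint and are kept verbatim; they appear to be fragments of signage or registration numbers visible in the photograph. A minimal sketch, again with an assumed file name:

    import boto3

    client = boto3.client("rekognition")
    with open("steinmetz_11303.jpg", "rb") as f:  # assumed local copy
        response = client.detect_text(Image={"Bytes": f.read()})

    # LINE detections reproduce whatever characters the model can read,
    # including partial or garbled fragments like those above.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])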