Human Generated Data

Title

Untitled (group portrait of Pan Am Life Insurance Company of New Orleans)

Date

1973

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8140

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Clothing 99.9
Apparel 99.9
Human 99.5
Person 99.5
Person 99.4
Person 99.4
Person 98.5
Person 98.5
Person 98.2
Person 97.3
Person 97
Dress 96.9
Face 95.8
Female 92.9
Shorts 92.1
Pants 91.4
Smile 88.2
Shirt 85.9
Overcoat 84.3
Coat 84.3
Suit 84.3
Woman 78
Footwear 77.9
Shoe 77.9
Sleeve 74.5
Portrait 73.9
Photography 73.9
Photo 73.9
Chair 73
Furniture 73
People 71.8
Sailor Suit 71.8
Shoe 70.5
Man 68.2
Girl 67.6
Long Sleeve 65.2
Tie 65
Accessories 65
Accessory 65
Kid 61.4
Child 61.4
Boy 56.1
Person 47.1
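
The Amazon label/confidence pairs above are consistent with output from Amazon Rekognition's DetectLabels operation. A minimal Python sketch of how such tags could be produced with boto3; the image file name and the confidence cutoff are illustrative assumptions, not values recorded here:

    # Hedged sketch: label tags of the form "Clothing 99.9" via Amazon Rekognition.
    # The file name and MinConfidence threshold are assumptions for illustration.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_8140.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=45,
        )

    for label in response["Labels"]:
        # Prints pairs such as "Clothing 99.9" / "Person 99.5".
        print(f"{label['Name']} {label['Confidence']:.1f}")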

Clarifai
created on 2023-10-25

people 99.7
group 97.5
group together 97.3
wear 95.6
adult 94.9
man 94.3
woman 93.7
uniform 92.7
portrait 90.7
several 82
leader 80.4
many 79.9
education 79.4
outfit 78.3
partnership 75.5
facial expression 75.4
five 72.2
actor 72
medical practitioner 68.8
military 66.2

Imagga
created on 2022-01-08

people 27.9
business 20.6
male 20.6
man 20.4
businessman 19.4
person 18.7
men 18
crowd 16.3
team 16.1
group 16.1
work 15.7
adult 15
black 14.5
silhouette 14.1
businesswoman 13.6
crutch 13.1
clothing 13
women 12.6
teamwork 12
street 12
city 11.6
walking 11.4
corporate 11.2
suit 11.1
urban 10.5
scene 10.4
worker 10.2
staff 10.1
occupation 10.1
fashion 9.8
job 9.7
success 9.6
sexy 9.6
light 9.3
world 9.1
old 9.1
human 9
life 9
building 8.7
standing 8.7
wall 8.5
shop 8.5
design 8.4
clothes 8.4
outfit 8.1
dress 8.1
stick 8.1
couple 7.8
architecture 7.8
audience 7.8
portrait 7.8
travel 7.7
leader 7.7
boss 7.6
walk 7.6
flag 7.6
wind instrument 7.5
garment 7.5
happy 7.5
style 7.4
body 7.2
mannequin 7.2
professional 7.1
indoors 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 99.1
clothing 93.7
footwear 93.6
posing 88
person 85.8
woman 76.3
black and white 74.6
man 73.4
smile 67.3
group 57.6
clothes 49.9

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 48-56
Gender Male, 85.7%
Calm 90.7%
Sad 3%
Happy 2.6%
Confused 1.3%
Disgusted 0.8%
Surprised 0.7%
Angry 0.5%
Fear 0.4%

AWS Rekognition

Age 39-47
Gender Male, 99.7%
Calm 75.9%
Sad 11%
Confused 8.3%
Happy 2.4%
Disgusted 0.8%
Angry 0.7%
Surprised 0.6%
Fear 0.4%

AWS Rekognition

Age 50-58
Gender Male, 99.8%
Calm 87.8%
Happy 9.9%
Sad 0.8%
Confused 0.7%
Surprised 0.3%
Disgusted 0.3%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 45-51
Gender Female, 92.4%
Happy 92.8%
Calm 5.7%
Sad 0.7%
Disgusted 0.2%
Confused 0.2%
Surprised 0.2%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 41-49
Gender Female, 96.1%
Calm 58%
Happy 29.4%
Sad 3.6%
Disgusted 3%
Angry 2.3%
Confused 1.7%
Fear 1%
Surprised 1%

AWS Rekognition

Age 49-57
Gender Female, 78.3%
Happy 99.7%
Calm 0.1%
Surprised 0.1%
Sad 0%
Disgusted 0%
Confused 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 30-40
Gender Female, 61.1%
Happy 68.6%
Calm 15.7%
Surprised 4.4%
Sad 3.6%
Disgusted 3.3%
Confused 2%
Fear 1.4%
Angry 1.1%

AWS Rekognition

Age 39-47
Gender Male, 81%
Calm 83.7%
Confused 3.8%
Happy 3.7%
Surprised 2.8%
Disgusted 2.6%
Sad 2.2%
Angry 0.7%
Fear 0.5%

AWS Rekognition

Age 31-41
Gender Male, 61.5%
Happy 86.2%
Calm 10.5%
Confused 1.4%
Sad 0.7%
Disgusted 0.5%
Surprised 0.4%
Angry 0.2%
Fear 0.1%
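
The per-face age range, gender, and emotion scores above are the attributes Amazon Rekognition returns from its DetectFaces operation when all facial attributes are requested. A hedged sketch, with the file name again an illustrative assumption:

    # Hedged sketch: age range, gender, and emotion scores per detected face.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_8140.jpg", "rb") as f:  # illustrative file name
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # include AgeRange, Gender, Emotions
        )

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions come back with confidences; sort to match the listing above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")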

Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
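
The repeated Surprise/Anger/Sorrow/Joy/Headwear/Blurred blocks above, one per detected face, correspond to the likelihood fields of Google Cloud Vision face detection. A minimal sketch assuming the google-cloud-vision Python client and an illustrative file name:

    # Hedged sketch: per-face likelihood ratings via Google Cloud Vision.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("steinmetz_8140.jpg", "rb") as f:  # illustrative file name
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # Likelihood enums print as VERY_UNLIKELY, UNLIKELY, POSSIBLE, ...
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)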

Feature analysis

Amazon

Person 99.5%
Shoe 77.9%
Tie 65%

Text analysis

Amazon

60640.
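
The single detected string above ("60640.") is the kind of result Amazon Rekognition's DetectText operation returns for text found in an image. A short sketch under the same illustrative assumptions as the earlier ones:

    # Hedged sketch: text detections such as "60640." via Amazon Rekognition.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_8140.jpg", "rb") as f:  # illustrative file name
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":  # skip word-level duplicates
            print(detection["DetectedText"], f"{detection['Confidence']:.1f}")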