Human Generated Data

Title

Untitled (seven young men and women sitting, talking, drinking and smoking on stairwell)

Date

1941

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9027

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.7
Human 99.7
Person 99
Person 98.3
Person 98.2
Person 98.1
Person 97.8
Person 97.1
Person 95.1
Worker 90.3
Person 85.5
Clothing 79.8
Apparel 79.8
Hairdresser 68.8
Photography 60.9
Photo 60.9
Portrait 60
Face 60
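
The scores above are percent confidences from Amazon Rekognition's label detection. A minimal sketch of the call that produces tags in this shape, using boto3; the S3 bucket and object key are hypothetical placeholders, not the museum's actual storage:

import boto3

# Minimal sketch: label detection with Amazon Rekognition via boto3.
# Bucket and key are illustrative placeholders only.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "4.2002.9027.jpg"}},
    MaxLabels=25,
    MinConfidence=60.0,  # the list above bottoms out around 60% confidence
)

for label in response["Labels"]:
    # Prints "Person 99.7"-style lines matching the format above.
    print(f"{label['Name']} {label['Confidence']:.1f}")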

Clarifai
created on 2023-10-26

people 99.9
group 99.1
man 97.8
adult 96.8
education 96.6
monochrome 95.9
child 95.9
woman 95.8
teacher 92
hospital 88.6
medical practitioner 88.6
group together 88.1
indoors 87.6
administration 84.9
leader 83.8
interaction 82.6
uniform 82.5
science 81.2
school 80.6
three 78.7
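
The Clarifai tags come from its general image-recognition model, which returns concept/confidence pairs (reported in 0-1 and shown above as percentages). A hedged sketch against Clarifai's v2 REST API; the model path and the access token are assumptions for illustration:

import requests

# Minimal sketch of a Clarifai v2 predict call. The public model path
# (users/clarifai/apps/main/models/general-image-recognition) and the
# personal access token are illustrative assumptions.
resp = requests.post(
    "https://api.clarifai.com/v2/users/clarifai/apps/main/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_PAT_HERE"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Clarifai returns values in [0, 1]; scale to percent as listed above.
    print(f"{concept['name']} {concept['value'] * 100:.1f}")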

Imagga
created on 2022-01-23

groom 38.3
man 26.9
people 26.8
person 24.8
male 24.1
couple 20.9
brass 20.9
adult 20.8
cornet 20.2
businessman 18.5
men 17.2
wind instrument 16.5
business 15.8
office 15.3
women 14.2
black 13.8
happy 13.8
musical instrument 13.7
home 13.6
room 13.1
happiness 12.5
professional 12.4
indoors 12.3
group 12.1
bride 11.9
kin 11.4
love 11
wedding 11
dress 10.8
television 10.8
silhouette 10.8
family 10.7
window 10.6
corporate 10.3
smiling 9.4
team 9
executive 8.7
sitting 8.6
smile 8.5
businesspeople 8.5
desk 8.5
portrait 8.4
modern 8.4
hand 8.3
world 8.3
suit 8.3
indoor 8.2
worker 8.2
computer 8
together 7.9
bouquet 7.7
nurse 7.6
meeting 7.5
one 7.5
event 7.4
back 7.3
successful 7.3
celebration 7.2
job 7.1
working 7.1
work 7.1
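
Imagga's tagging endpoint returns the same tag/confidence shape. A minimal sketch using its REST API with HTTP basic auth; the credentials and image URL are placeholders:

import requests

# Minimal sketch of Imagga's /v2/tags endpoint; credentials and the
# image URL are placeholders for illustration.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
)
resp.raise_for_status()

for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")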

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 99.2
clothing 96
person 90.9
human face 85.6
man 83
drawing 67.6
woman 65
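
The Microsoft tags come from Azure's Computer Vision image-analysis service. A sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint and key are placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Minimal sketch: endpoint and key are placeholders for illustration.
client = ComputerVisionClient(
    "https://example.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

result = client.tag_image("https://example.com/photo.jpg")
for tag in result.tags:
    # The SDK reports confidence in [0, 1]; scale to percent as above.
    print(f"{tag.name} {tag.confidence * 100:.1f}")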

Color analysis

Face analysis

AWS Rekognition

Age 28-38
Gender Male, 96.8%
Calm 88.5%
Sad 9.4%
Angry 0.7%
Happy 0.5%
Surprised 0.4%
Disgusted 0.2%
Confused 0.2%
Fear 0.1%

AWS Rekognition

Age 48-54
Gender Male, 99.3%
Calm 99.9%
Happy 0%
Confused 0%
Disgusted 0%
Fear 0%
Sad 0%
Surprised 0%
Angry 0%

AWS Rekognition

Age 25-35
Gender Male, 99.8%
Calm 95.7%
Sad 2.4%
Happy 1%
Surprised 0.3%
Confused 0.2%
Angry 0.2%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 33-41
Gender Male, 99.9%
Calm 85.6%
Sad 9.4%
Confused 2.2%
Happy 1.7%
Disgusted 0.4%
Surprised 0.3%
Angry 0.3%
Fear 0.2%

AWS Rekognition

Age 16-22
Gender Male, 93.3%
Calm 94.2%
Sad 3%
Happy 0.9%
Confused 0.9%
Disgusted 0.4%
Angry 0.3%
Surprised 0.3%
Fear 0.1%
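
Each block above is one detected face from Amazon Rekognition's face analysis: an estimated age range, a gender guess with confidence, and a percent score for each of eight emotions. A minimal sketch of the underlying call, with the same placeholder bucket and key as before:

import boto3

# Minimal sketch: face analysis with Attributes=["ALL"] returns age range,
# gender, and per-emotion confidences for every detected face.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "4.2002.9027.jpg"}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        # Emotion types arrive uppercase (e.g. "CALM"); match the listing above.
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")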

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
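
Google Vision reports each facial attribute as a likelihood bucket (VERY_UNLIKELY through VERY_LIKELY) rather than a numeric score, which is why every field above reads "Very unlikely". A sketch with the google-cloud-vision client; credentials are assumed to come from the environment, and the image URL is a placeholder:

from google.cloud import vision

# Minimal sketch: Google Cloud Vision face detection. Authentication is
# assumed via GOOGLE_APPLICATION_CREDENTIALS in the environment.
client = vision.ImageAnnotatorClient()
image = vision.Image(source=vision.ImageSource(image_uri="https://example.com/photo.jpg"))

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is a Likelihood enum, not a percentage.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)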

Feature analysis

Amazon

Person 99.7%

Categories

Imagga

interior objects 98.8%
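
The single "interior objects" entry is an Imagga categorization rather than a tag: Imagga's categorizers score an image against a fixed set of scene categories. A sketch, assuming the personal_photos categorizer (the categorizer ID is an assumption, and credentials are placeholders as before):

import requests

# Minimal sketch of Imagga's categorization endpoint; "personal_photos"
# is an assumed categorizer ID, credentials are placeholders.
resp = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
)
resp.raise_for_status()

for category in resp.json()["result"]["categories"]:
    print(f"{category['name']['en']} {category['confidence']:.1f}%")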

Text analysis

Amazon

M 113
ЭТАЯТIЙ
M 113 ЭТАЯТIЙ A7AA
A7AA

Google

MI 3TARTIAROA
MI
3TARTIAROA
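
Both engines also ran OCR on the image; the garbled-looking strings above are their raw detections, kept verbatim (they appear to be reversed or mirrored lettering in the scene, read with lookalike glyphs). A minimal sketch of the Rekognition side of that text pass, with the same placeholder bucket and key:

import boto3

# Minimal sketch: OCR with Amazon Rekognition DetectText. LINE-type
# detections correspond to the grouped strings above; WORD-type
# detections are the individual fragments.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "4.2002.9027.jpg"}}
)

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])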