Human Generated Data

Title

Untitled (nine guests sitting at Bar Mitzvah banquet table)

Date

1948

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9222
Machine Generated Data

Tags

Amazon
created on 2022-01-23

Human 99.8
Person 99.8
Person 99.3
Helmet 99.2
Apparel 99.2
Clothing 99.2
Person 99.1
Person 98.7
Person 98
Sunglasses 97.6
Accessories 97.6
Accessory 97.6
Person 97.1
Person 94
Face 92.5
Furniture 91.2
Chair 91.2
Table 91
Person 86.3
Hat 86.2
Dining Table 85.6
Person 81.1
Meal 80
Food 80
Sitting 76.2
Dish 74.8
Glasses 74.4
Helmet 74.3
Suit 73.7
Coat 73.7
Overcoat 73.7
Helmet 73.1
Photography 71.7
Portrait 71.7
Photo 71.7
Drawing 70.9
Art 70.9
People 70.1
Female 67.8
Head 60.4
Man 58.6
Scientist 57.4
Shirt 56.8
Bed 55.7
Text 55.1
Person 50.9

Imagga
created on 2022-01-23

man 34.2
male 31.2
person 29.3
people 24.5
work 22.7
iron lung 22.1
student 19.7
education 19
adult 18.9
respirator 17.7
business 17.6
room 17.5
businessman 16.7
brass 16.3
classroom 15
technology 14.8
human 14.2
science 14.2
device 13.9
professional 13.6
teacher 13.4
breathing device 13.4
musical instrument 13.3
working 13.2
medical 13.2
senior 13.1
blackboard 13
job 12.4
wind instrument 12.1
group 12.1
men 12
team 11.6
worker 11.5
hand 11.4
office 11.3
cornet 11.2
looking 11.2
equipment 10.8
school 10.8
medicine 10.6
plan 10.4
nurse 10
old 9.7
teaching 9.7
portrait 9.7
class 9.6
computer 9.6
doctor 9.4
teamwork 9.3
music 9.1
modern 9.1
astronaut 9
health 9
writing 9
surgeon 8.9
indoors 8.8
happy 8.8
world 8.8
chemistry 8.7
smiling 8.7
engineer 8.6
clinic 8.6
chart 8.6
drawing 8.6
sitting 8.6
research 8.5
finance 8.4
manager 8.4
occupation 8.2
smile 7.8
black 7.8
diagram 7.7
engineering 7.6
instrument 7.5
help 7.4
holding 7.4
patient 7.4
laptop 7.3
suit 7.2
to 7.1

Google
created on 2022-01-23

Shirt 94.1
Hat 86.7
Black-and-white 84.2
Style 83.8
Headgear 82.5
Art 80
Font 77.9
Motor vehicle 77.4
Monochrome photography 73.6
Monochrome 73.5
Vintage clothing 73.4
Crew 72.5
Event 71
Fedora 70.1
T-shirt 69.2
Team 68.8
Room 68
Sun hat 66.6
Illustration 64.6
History 64.4

Microsoft
created on 2022-01-23

text 99.3
person 95.4
book 94.4
clothing 93.2
human face 87.6
man 83
drawing 68.2
player 67.9
posing 65.7
old 45

Face analysis

Amazon

Google

AWS Rekognition

Age 49-57
Gender Male, 99.9%
Calm 77.6%
Happy 9.1%
Confused 5%
Sad 3.5%
Disgusted 1.9%
Surprised 1.2%
Angry 1.1%
Fear 0.6%

AWS Rekognition

Age 23-33
Gender Male, 85.4%
Happy 87.5%
Sad 9.6%
Surprised 0.9%
Calm 0.7%
Confused 0.4%
Angry 0.3%
Disgusted 0.3%
Fear 0.2%

AWS Rekognition

Age 27-37
Gender Female, 90.6%
Happy 77.6%
Surprised 19.6%
Sad 1.1%
Confused 0.6%
Angry 0.3%
Disgusted 0.3%
Fear 0.3%
Calm 0.2%

AWS Rekognition

Age 29-39
Gender Male, 99.9%
Calm 51.4%
Angry 31.2%
Surprised 8.5%
Happy 5.4%
Disgusted 1.8%
Sad 0.7%
Fear 0.6%
Confused 0.6%

AWS Rekognition

Age 27-37
Gender Male, 81.4%
Happy 60.5%
Calm 22%
Surprised 7.7%
Sad 6.5%
Fear 1%
Confused 0.9%
Disgusted 0.9%
Angry 0.6%

AWS Rekognition

Age 33-41
Gender Male, 87.5%
Surprised 97.4%
Calm 0.8%
Happy 0.6%
Fear 0.6%
Sad 0.2%
Confused 0.2%
Angry 0.1%
Disgusted 0.1%

AWS Rekognition

Age 24-34
Gender Female, 61.6%
Happy 74.7%
Surprised 11.1%
Calm 4.8%
Sad 3.3%
Confused 2%
Disgusted 2%
Angry 1.1%
Fear 0.9%

AWS Rekognition

Age 52-60
Gender Male, 100%
Sad 77.1%
Confused 11.5%
Disgusted 4.1%
Surprised 2.3%
Calm 1.7%
Angry 1.3%
Fear 1.2%
Happy 0.9%

AWS Rekognition

Age 23-31
Gender Female, 95.1%
Happy 53.8%
Calm 17.9%
Fear 12.3%
Sad 8.9%
Surprised 2.5%
Angry 1.7%
Disgusted 1.5%
Confused 1.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Helmet 99.2%
Sunglasses 97.6%
Hat 86.2%

Captions

Microsoft

a group of people posing for a photo 80.7%
a group of people posing for the camera 80.6%
a group of men posing for a photo 76.4%

Text analysis

Amazon

YT3RAS
MJ17 YT3RAS ALSAN
MJ17
$ 2805
ALSAN

Google

MJ17
YT3RA
MJ17 YT3RA