Human Generated Data

Title

Untitled (two men grinning at two young boys on floor of living room decorated for Christmas)

Date

1948

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9154

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.7
Human 99.7
Clothing 99.4
Apparel 99.4
Person 99.1
Person 99
Chair 83.8
Furniture 83.8
Face 83.1
Wheel 81.7
Machine 81.7
Airplane 79.9
Transportation 79.9
Vehicle 79.9
Aircraft 79.9
Wheel 78.9
Female 76
Plant 71.6
Indoors 68.1
Girl 66
Wedding 65.4
Bridegroom 65.4
People 65.2
Robe 65.1
Fashion 65.1
Portrait 65
Photography 65
Photo 65
Leisure Activities 64.5
Gown 62.9
Room 62
Suit 61.5
Coat 61.5
Overcoat 61.5
Wedding Gown 59.4
Kid 57.9
Child 57.9
Floor 57.2
Outdoors 57.1
Woman 56.4
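
The list above is raw image-tagging output with confidence scores. A minimal sketch of how such tags might be produced, assuming AWS Rekognition's DetectLabels operation (the record only names "Amazon") and a hypothetical S3 location for the digitized print:

# Hedged sketch: the bucket, key, and region are assumptions, not values from the record.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "4.2002.9154.jpg"}},
    MinConfidence=55,  # the listed tags bottom out in the mid-50s
)

# Print one "Name Confidence" pair per line, matching the layout of the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')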

Imagga
created on 2022-01-23

musical instrument 47.2
percussion instrument 34.5
man 31.6
people 28.4
male 24.1
hairdresser 22.3
drum 21.5
person 21
men 18.9
wind instrument 18.8
room 16.5
business 15.8
adult 14.7
chair 14.1
brass 13.9
group 13.7
table 13
work 12.8
worker 12.5
city 12.5
sitting 12
women 11.8
lifestyle 11.6
businessman 11.5
office 11.2
shop 11
happy 10.6
indoors 10.5
portrait 10.3
holding 9.9
hand 9.9
marimba 9.8
trombone 9.8
old 9.7
human 9.7
sax 9.7
outdoors 9.7
accordion 9.4
barbershop 9.3
indoor 9.1
playing 9.1
job 8.8
steel drum 8.8
smiling 8.7
teacher 8.7
modern 8.4
mature 8.4
teamwork 8.3
leisure 8.3
home 8
interior 8
working 7.9
device 7.9
together 7.9
urban 7.9
smile 7.8
life 7.8
keyboard instrument 7.7
studying 7.7
talking 7.6
fashion 7.5
study 7.4
street 7.4
back 7.3
music 7.3
school 7.2
team 7.2
classroom 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 94.2
black and white 93.6
outdoor 93.6
person 92.1
clothing 89.2
street 82
footwear 76.7
man 71.7
monochrome 52.6

Face analysis

AWS Rekognition

Age 41-49
Gender Male, 97%
Happy 91.7%
Calm 3.4%
Confused 1.8%
Sad 1.3%
Surprised 0.6%
Disgusted 0.6%
Angry 0.3%
Fear 0.3%

AWS Rekognition

Age 34-42
Gender Female, 61.3%
Calm 99.2%
Sad 0.2%
Confused 0.2%
Happy 0.1%
Surprised 0.1%
Angry 0.1%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 36-44
Gender Female, 62.4%
Calm 93.2%
Sad 5.9%
Confused 0.4%
Angry 0.2%
Surprised 0.1%
Disgusted 0.1%
Happy 0.1%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely
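
Each face block above pairs an estimated age range, a gender guess, and a ranked emotion distribution for one detected face. A minimal sketch of how the AWS Rekognition blocks could be generated, assuming the DetectFaces operation with all attributes requested; the image location is a placeholder:

# Hedged sketch: the bucket and key are assumptions, not values from the record.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "4.2002.9154.jpg"}},
    Attributes=["ALL"],  # required to get AgeRange, Gender, and Emotions
)

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions arrive unsorted; sort descending to mirror the lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')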

Feature analysis

Amazon

Person 99.7%
Wheel 81.7%
Airplane 79.9%

Captions

Microsoft

a group of people sitting in front of a building 67.5%
a group of people sitting on a bench 48.8%
a group of people standing in front of a building 48.7%
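
The three ranked captions above are consistent with an image-description service that returns several candidate sentences with confidences. A minimal sketch, assuming Azure's Computer Vision "describe" operation (the record only names "Microsoft"); the endpoint, key, and image URL are placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://example.cognitiveservices.azure.com/",        # hypothetical endpoint
    CognitiveServicesCredentials("YOUR_SUBSCRIPTION_KEY"),  # hypothetical key
)

description = client.describe_image(
    "https://example.org/4.2002.9154.jpg",  # hypothetical image URL
    max_candidates=3,
)

for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")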

Text analysis

Amazon

00
MJIF
50
Adgats
MJIF ACTMA
ACTMA

Google

MJI7 YT33A2 MA
MJI7
YT33A2
MA
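
The strings above are raw OCR hits, likely edge markings on the negative rather than text in the scene. A minimal sketch of how the Amazon list could be produced, assuming the Rekognition DetectText operation with a placeholder image location:

# Hedged sketch: the bucket and key are assumptions, not values from the record.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "4.2002.9154.jpg"}},
)

# LINE detections give grouped strings such as "MJIF ACTMA"; WORD detections give single tokens.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"], f'{detection["Confidence"]:.1f}')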