Human Generated Data

Title

Untitled (boy and girl in costumes posed holding hands between two curtains)

Date

1942

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9039
Machine Generated Data

Tags

Amazon
created on 2022-01-23

Clothing 100
Apparel 100
Person 99
Human 99
Female 98.6
Skirt 96.1
Person 95
Dress 94.6
Woman 93.2
Shoe 91.3
Footwear 91.3
Shoe 91.3
Shorts 85.4
Accessory 68.6
Sunglasses 68.6
Accessories 68.6
Girl 55.6
Standing 55.2

Imagga
created on 2022-01-23

people 30.1
man 23.5
person 23.4
male 20
adult 19.6
business 18.2
fashion 18.1
happy 15.7
businessman 15
pretty 14.7
attractive 13.3
black 13
work 12.7
world 12.7
happiness 12.5
blackboard 12.5
model 12.4
job 12.4
couple 12.2
professional 12.1
smile 12.1
men 12
women 11.9
portrait 11.6
hand 11.4
clothing 11.1
love 11
dress 10.8
family 10.7
bride 10.5
sexy 10.4
looking 10.4
window 10.3
bag 10.2
businesswoman 10
silhouette 9.9
lady 9.7
group 9.7
boss 9.6
youth 9.4
wedding 9.2
style 8.9
success 8.9
interior 8.8
indoors 8.8
crowd 8.6
corporate 8.6
walking 8.5
casual 8.5
two 8.5
travel 8.4
street 8.3
office 8.3
occupation 8.2
human 8.2
alone 8.2
indoor 8.2
one 8.2
briefcase 8.2
room 8.1
child 8.1
team 8.1
suit 8
cleaner 7.9
cute 7.9
together 7.9
executive 7.9
standing 7.8
leader 7.7
kin 7.6
elegance 7.6
store 7.6
holding 7.4
teamwork 7.4
symbol 7.4
girls 7.3
smiling 7.2
lifestyle 7.2
body 7.2
hair 7.1
face 7.1
life 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

clothing 95.8
text 94.1
window 91.9
person 88.3
dress 82.6
sketch 77
footwear 56.8
black and white 53.4
posing 45.5

Face analysis

Amazon

AWS Rekognition

Age 25-35
Gender Male, 98.2%
Happy 70.5%
Surprised 11.9%
Sad 4.8%
Calm 4.8%
Fear 4.1%
Disgusted 1.7%
Angry 1.2%
Confused 1%

AWS Rekognition

Age 34-42
Gender Female, 78.3%
Happy 93.9%
Calm 5.4%
Surprised 0.4%
Confused 0.1%
Disgusted 0.1%
Fear 0.1%
Sad 0%
Angry 0%

Feature analysis

Amazon

Person 99%
Shoe 91.3%
Sunglasses 68.6%

Captions

Microsoft

a man standing in front of a window 86.8%
a man standing next to a window 84.6%
a man standing in front of a window posing for the camera 79.3%

Text analysis

Amazon

8
M 117
M 117 YE3A A70A
A70A
YE3A

Google

YT3A
A3A
MI YT3A 2 A3A
2
MI