Human Generated Data

Title

Untitled (newlyweds getting into car)

Date

c. 1970

People

Artist: Ken Whitmire Associates, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19822

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Clothing 99.3
Apparel 99.3
Human 98.2
Person 98.2
Person 98.1
Person 97.7
Person 90.1
Overcoat 90.1
Tie 81.7
Accessories 81.7
Accessory 81.7
Art 80.4
Drawing 77.8
Sketch 66.3
Sleeve 66.3
Face 64
Suit 61.1
Portrait 61
Photography 61
Photo 61
Coat 58.6
Crowd 58.3
Pedestrian 57
Female 56.8
Person 55.1
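
A minimal sketch, assuming boto3 and a placeholder scan of the print (the file path and region are not from this record), of how label/confidence pairs like the Amazon list above could be produced with the Rekognition DetectLabels API:

import boto3

# Hypothetical client and input; the image path is a placeholder, not the actual scan.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo_scan.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},   # inline bytes; an S3Object reference also works
        MinConfidence=55,            # roughly the lowest score shown in the list above
    )

for label in response["Labels"]:
    # Each label carries a name and a 0-100 confidence score,
    # mirroring rows such as "Clothing 99.3" and "Apparel 99.3" above.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')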

Imagga
created on 2022-03-05

man 25.5
person 25
people 22.9
silhouette 22.3
business 20.6
sky 19.8
male 18.4
outdoors 16.4
success 16.1
work 15.9
businessman 15.9
group 14.5
men 13.7
ski 13.3
adult 13.3
manager 13
successful 12.8
sunset 11.7
black 10.8
job 10.6
building 10.6
professional 10.4
corporate 10.3
company 10.2
happy 10
travel 9.9
tourist 9.7
painter 9.6
laptop 9.5
hand 9.1
businesswoman 9.1
executive 9
lifestyle 8.7
happiness 8.6
arms 8.6
chart 8.6
suit 8.1
team 8.1
sun 8
worker 8
looking 8
engineer 8
day 7.8
couple 7.8
hands 7.8
traveler 7.8
color 7.8
attractive 7.7
achievement 7.7
old 7.7
outdoor 7.6
clouds 7.6
teacher 7.5
technology 7.4
sport 7.4
world 7.4
vacation 7.4
office 7.3
water 7.3
design 7.3
smiling 7.2
active 7.2
light 7.2
summer 7.1
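
A similar sketch for the Imagga tags above, assuming the Imagga v2 REST tagging endpoint with HTTP Basic auth; the key, secret, and image URL are placeholders, and the exact request shape should be checked against Imagga's current documentation:

import requests

# Placeholder credentials and image URL; not the actual values used for this record.
IMAGGA_KEY, IMAGGA_SECRET = "api_key", "api_secret"
image_url = "https://example.org/photo_scan.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    # Each entry holds a confidence score and a language-keyed tag,
    # e.g. "man 25.5", "person 25" as in the list above.
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')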

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

outdoor 97.6
text 94.1
drawing 90.3
black and white 85.1
clothing 84.2
sketch 82.5
person 76
man 65.2
posing 41.7
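
A sketch of how tag/confidence pairs like the Microsoft list above could be obtained, assuming the azure-cognitiveservices-vision-computervision Python SDK; the endpoint, key, and image URL are placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and subscription key; not the actual service used for this record.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<subscription-key>"),
)

result = client.tag_image("https://example.org/photo_scan.jpg")
for tag in result.tags:
    # Confidence is returned on a 0-1 scale and shown here as 0-100 to match the scores above.
    print(f"{tag.name} {tag.confidence * 100:.1f}")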

Face analysis

Amazon

Google

AWS Rekognition

Age 43-51
Gender Male, 91.8%
Happy 59.1%
Sad 33%
Confused 3.7%
Calm 1.1%
Disgusted 1%
Surprised 1%
Angry 0.7%
Fear 0.5%

AWS Rekognition

Age 48-54
Gender Male, 80.7%
Confused 61.9%
Surprised 7.8%
Happy 7.6%
Calm 6.6%
Fear 5.7%
Disgusted 5.2%
Sad 2.8%
Angry 2.2%
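
A minimal sketch, again assuming boto3 and a placeholder image, of how the age range, gender, and emotion percentages in the AWS Rekognition face blocks above could be produced with the DetectFaces API:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo_scan.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],   # request age, gender, and emotions, not just bounding boxes
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions come back unordered; sort by confidence to match the descending lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')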

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
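
A sketch of how the Google Vision likelihood ratings above could be generated, assuming the google-cloud-vision client library and a placeholder image URI:

from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image(
    source=vision.ImageSource(image_uri="gs://example-bucket/photo_scan.jpg")
)

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY),
    # roughly matching rows such as "Surprise Very unlikely" above.
    for name, value in [
        ("Surprise", face.surprise_likelihood),
        ("Anger", face.anger_likelihood),
        ("Sorrow", face.sorrow_likelihood),
        ("Joy", face.joy_likelihood),
        ("Headwear", face.headwear_likelihood),
        ("Blurred", face.blurred_likelihood),
    ]:
        print(name, vision.Likelihood(value).name.replace("_", " ").capitalize())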

Feature analysis

Amazon

Person 98.2%
Tie 81.7%
Suit 61.1%
Coat 58.6%

Captions

Microsoft

a group of people posing for a photo 85.6%
a group of people posing for the camera 85.5%
a group of people posing for a picture 85.4%
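
A sketch of how caption candidates like the Microsoft ones above could be requested, assuming the same azure-cognitiveservices-vision-computervision SDK and placeholder credentials as in the tagging sketch earlier:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<subscription-key>"),
)

# Ask for several caption candidates so multiple phrasings come back,
# as in the three near-identical captions listed above.
description = client.describe_image("https://example.org/photo_scan.jpg", max_candidates=3)
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")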

Text analysis

Amazon

36
UCK
KODAK-AEITW
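
A minimal sketch, assuming boto3 and a placeholder image, of how detected text strings like those above could be extracted with the Rekognition DetectText API:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo_scan.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":   # lines only; each line is also broken into WORD entries
        print(detection["DetectedText"])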