Human Generated Data

Title

Untitled (men helping bride get into car)

Date

1957

People

Artist: Ken Whitmire Associates, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19833

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 98.9
Human 98.9
Person 98.8
Clothing 96.2
Apparel 96.2
Shorts 89.5
Transportation 88.4
Automobile 88.4
Vehicle 88.4
Car 88.4
Coat 72.3
Overcoat 72.3
Home Decor 68.4
Meal 61.4
Food 61.4
Suit 60.7
Sleeve 59.7
Wheel 58.8
Machine 58.8
Person 58.5
Long Sleeve 57.9
Pedestrian 56.9
Spoke 56.5
Kiosk 56.3

Imagga
created on 2022-03-05

people 35.1
man 30.2
businessman 30
business 29.8
professional 29.1
male 28.4
corporate 25.8
adult 25.6
person 25.4
group 19.3
men 18.9
team 18.8
teamwork 18.5
suit 18.2
office 17.9
meeting 17
happy 16.9
manager 16.8
success 16.1
businesswoman 15.4
building 15.1
women 15
work 14.9
teacher 13.6
executive 13.4
job 13.3
smiling 13
successful 12.8
pretty 12.6
attractive 12.6
clothing 12.3
worker 12
lifestyle 11.6
holding 11.5
boss 11.5
modern 11.2
black 11.1
room 10.9
bride 10.7
working 10.6
indoors 10.5
businesspeople 10.4
portrait 10.3
communication 10.1
dress 9.9
stage 9.9
groom 9.9
hands 9.6
ethnic 9.5
smile 9.3
wedding 9.2
occupation 9.2
laptop 9.1
looking 8.8
love 8.7
happiness 8.6
workplace 8.6
adults 8.5
human 8.2
new 8.1
sexy 8
20-24 years 7.9
urban 7.9
handshake 7.8
model 7.8
colleagues 7.8
full length 7.8
diversity 7.7
partnership 7.7
finance 7.6
hand 7.6
outdoors 7.6
career 7.6
fashion 7.5
clothes 7.5
commerce 7.5
presentation 7.4
instrument 7.4
case 7.2
educator 7.2
color 7.2
day 7.1

Microsoft
created on 2022-03-05

person 98
text 95.9
black and white 95
clothing 93.6
man 85.7
outdoor 85.5
ship 80.8
standing 77.5
monochrome 66.1

Feature analysis

Amazon

Person 98.9%
Car 88.4%
Wheel 58.8%

Captions

Microsoft

a man and a woman looking at the camera 52.5%
a man standing next to a woman 52.4%
a man and a woman standing in front of a building 52.3%

Text analysis

Amazon

First
-
- -
METHODIST
40

Google

MJI7--Y T 3RA°2--XAGON
MJI7--Y
3RA°2--XAGON
T