Human Generated Data

Title

Untitled (bride, groom, and guests by car)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16520

Machine Generated Data

Tags

Amazon
created on 2022-02-11

Nature 99.4
Person 98.7
Human 98.7
Person 96.9
Person 96.7
Smoke 94.1
Person 91.6
Fog 90.8
Apparel 88.8
Clothing 88.8
Person 88.7
Person 83.8
Smog 82.6
Aircraft 81.5
Vehicle 81.5
Airplane 81.5
Transportation 81.5
Outdoors 78.1
People 64.3
Coat 60.2
Pedestrian 59.5
Weather 58.9
Drawing 57.6
Art 57.6
Pollution 55.5
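The machine-generated tag lists above pair each label with a percent confidence score. As a minimal sketch, assuming the space-separated "label score" line format shown here (labels themselves may contain spaces, e.g. "land vehicle 93.9"), such a list can be parsed into structured pairs:

```python
import re

def parse_tags(raw: str) -> list[tuple[str, float]]:
    """Parse 'Label score' lines (e.g. 'Nature 99.4') into (label, score) pairs.

    Labels may contain spaces, so each line is split on its final
    numeric token rather than on the first whitespace.
    """
    tags = []
    for line in raw.strip().splitlines():
        m = re.match(r"^(.*\S)\s+([\d.]+)$", line.strip())
        if m:
            tags.append((m.group(1), float(m.group(2))))
    return tags

# Hypothetical excerpt of the tag list above
sample = """\
Nature 99.4
Person 98.7
land vehicle 93.9
"""
print(parse_tags(sample))
```

The regex anchors on the trailing number, so multi-word labels survive intact; lines without a trailing score (section headers, blank lines) are skipped.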

Imagga
created on 2022-02-11

office 22.3
man 22.2
travel 21.1
room 20.6
people 18.4
ocean 17.6
business 17.6
sea 17.4
transportation 17
water 16.7
sky 16.6
classroom 15.5
table 14.9
chair 14.1
tourism 14
boat 13.8
person 12.8
professional 12.7
work 12.6
airport 12.5
vacation 12.3
adult 12.1
computer 12
men 12
building 11.9
architecture 11.8
beach 11.8
lifestyle 11.6
businessman 11.5
indoors 11.4
modern 11.2
luxury 11.1
transport 11
relaxation 10.9
desk 10.8
city 10.8
ship 10.7
sitting 10.3
summer 10.3
day 10.2
male 9.9
coast 9.9
group 9.7
urban 9.6
bay 9.6
relax 9.3
teacher 9.3
device 8.9
interior 8.8
sand 8.8
holiday 8.6
glass 8.6
journey 8.5
yacht 8.5
communication 8.4
tourist 8.3
outdoors 8.2
working 7.9
hall 7.9
cruise 7.8
space 7.8
corporate 7.7
port 7.7
bridge 7.7
meeting 7.5
horizontal 7.5
laptop 7.4
technology 7.4
vehicle 7.4
structure 7.3
occupation 7.3
reflection 7.3
equipment 7.2
recreation 7.2
team 7.2

Microsoft
created on 2022-02-11

text 99.4
vehicle 95.5
land vehicle 93.9
car 92.9
outdoor 90.3
black and white 86.6
clothing 75.7
person 73.3
man 53.1

Face analysis

AWS Rekognition

Age 18-26
Gender Male, 59.9%
Sad 78.2%
Calm 12.8%
Confused 5.7%
Angry 1.8%
Disgusted 0.6%
Fear 0.3%
Happy 0.3%
Surprised 0.3%

AWS Rekognition

Age 22-30
Gender Male, 94.7%
Calm 99.6%
Sad 0.1%
Happy 0.1%
Disgusted 0.1%
Angry 0%
Confused 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 35-43
Gender Male, 99.9%
Calm 88.1%
Sad 7.3%
Confused 1.4%
Surprised 1%
Disgusted 0.8%
Happy 0.7%
Angry 0.4%
Fear 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.7%
Airplane 81.5%

Captions

Microsoft

a group of people standing in front of a car 90.5%
a group of people standing in front of a car posing for the camera 87.4%
a group of people posing for a photo in front of a car 85.9%

Text analysis

Amazon

MILK
2
HOTEL
47
Fre MILK
Fre

Google

Er
HOTEL
A2
--
MJ
AGON
-YT3
47 HOTEL Er MILK MJ」ヨー-YT3ヨA2--AGON
47
MILK