Human Generated Data

Title

Untitled (four men in Pittman & Son supermarket parking lot)

Date

1954

People

Artist: Harry Annas, American, 1897 - 1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3280

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Human 99.7
Person 99.7
Person 99.7
Person 99.7
Person 99.5
Car 98.4
Automobile 98.4
Vehicle 98.4
Transportation 98.4
Officer 94.2
Military Uniform 94.2
Military 94.2
Car 92.8
Wheel 90.6
Machine 90.6
Car 88.8
Car 87.5
Person 84.8
Tarmac 77
Asphalt 77
Captain 67.1
Person 61.5

Imagga
created on 2022-01-22

limousine 77.3
car 75.1
motor vehicle 49.5
transportation 25.1
people 22.9
road 22.6
business 20
crowd 19.2
travel 19
wheeled vehicle 18.9
highway 18.3
male 17
man 16.8
street 16.6
silhouette 16.6
flag 16
businessman 15.9
city 15.8
teamwork 15.8
traffic 15.2
job 15
transport 14.6
businesswoman 14.5
team 14.3
person 14.3
audience 13.6
work 13.3
sky 12.8
nighttime 12.7
boss 12.4
symbol 12.1
vivid 12.1
occupation 11.9
president 11.8
cheering 11.8
asphalt 11.7
stadium 11.7
patriotic 11.5
nation 11.4
urban 11.4
presentation 11.2
lights 11.1
speed 11
summer 10.9
supporters 10.9
speech 10.8
outdoor 10.7
leader 10.6
vibrant 10.5
icon 10.3
day 10.2
bright 10
vehicle 9.9
freeway 9.8
group 9.7
landscape 9.7
sexy 9.6
sport 9.5
drive 9.5
motion 9.4
architecture 9.4
outdoors 9.2
truck 9.1
tourism 9.1
design 9
world 8.8
scene 8.7
sunny 8.6
line 8.6
walking 8.5
park 8.2
to 8
women 7.9
grass 7.9
men 7.7
automobile 7.7
two 7.6
direction 7.6
clouds 7.6
sign 7.5
ocean 7.5
black 7.2
activity 7.2
groom 7.2
sea 7

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 97.7
outdoor 93
vehicle 90.7
person 85.6
car 77.6
man 76.7
land vehicle 76.4
black and white 73.2
black 66.1
white 64.3
clothing 54.8

Face analysis

AWS Rekognition

Age 31-41
Gender Female, 85.5%
Calm 71.6%
Surprised 10.8%
Happy 8.6%
Angry 2.4%
Disgusted 2.4%
Fear 2.1%
Confused 1.1%
Sad 0.9%

AWS Rekognition

Age 45-53
Gender Male, 97.5%
Calm 71.5%
Surprised 11.6%
Sad 6.3%
Happy 3%
Angry 2.8%
Confused 2.5%
Disgusted 1.8%
Fear 0.5%

AWS Rekognition

Age 24-34
Gender Female, 62.9%
Calm 38.4%
Sad 36.9%
Disgusted 9.7%
Surprised 5%
Confused 4.9%
Angry 2.2%
Happy 2.2%
Fear 0.7%

AWS Rekognition

Age 26-36
Gender Male, 89.8%
Surprised 51.5%
Fear 22.6%
Sad 12.1%
Disgusted 8%
Confused 2.3%
Angry 2%
Calm 0.8%
Happy 0.6%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Car 98.4%
Wheel 90.6%

Captions

Microsoft

a group of people posing for a photo 87.9%
a group of people posing for a picture 87.8%
a group of people posing for the camera 87.7%

Text analysis

Google

MJI7--YT3RA°2--AG