Human Generated Data

Title

Untitled (bride and older woman approaching church from street)

Date

1949

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6178

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.3
Human 99.3
Car 98.5
Transportation 98.5
Vehicle 98.5
Automobile 98.5
Person 97.5
Person 96.1
Sport 86.2
Sports 86.2
Tennis Court 86.2
Car 84.8
Tarmac 79.1
Asphalt 79.1
Car 74.9
People 68.7
Person 68.6
Car 58.4
Wheel 57.7
Machine 57.7
Pedestrian 56.9
Shorts 55.4
Clothing 55.4
Apparel 55.4

Imagga
created on 2022-01-23

cleaning implement 32.4
road 28.9
intersection 26.2
street 25.8
travel 25.3
urban 24.5
broom 24.2
city 21.6
transportation 19.7
concrete 17.2
traffic 17.1
crutch 16.9
swab 16.1
asphalt 15.6
line 15.6
speed 14.7
stick 14.3
lane 13.7
transport 13.7
staff 13.5
highway 13.5
track 13.5
airport 13
outdoor 12.2
sign 12
building 12
pavement 11.8
car 11.5
aircraft carrier 11.3
architecture 10.9
sky 10.8
vehicle 10.8
man 10.7
ship 10.6
direction 10.5
landscape 10.4
business 10.3
water 10
tourism 9.9
warship 9.4
old 9.1
vacation 9
sport 8.9
empty 8.6
walk 8.6
construction 8.6
way 8.5
walking 8.5
drive 8.5
dark 8.4
tourist 8.2
paint 8.1
parking 7.9
day 7.8
sea 7.8
space 7.8
summer 7.7
motion 7.7
park 7.4
safety 7.4
black 7.2
work 7.1
military vehicle 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 98.4
black and white 97.4
monochrome 79.3
white 60.7

Face analysis

Amazon

AWS Rekognition

Age 27-37
Gender Male, 99.8%
Calm 98.2%
Confused 0.5%
Fear 0.4%
Happy 0.3%
Surprised 0.2%
Disgusted 0.2%
Sad 0.1%
Angry 0.1%

AWS Rekognition

Age 6-16
Gender Male, 93.3%
Fear 30%
Sad 29.6%
Calm 23.5%
Happy 7.1%
Angry 3.6%
Disgusted 2.7%
Surprised 2.5%
Confused 1.1%

AWS Rekognition

Age 18-26
Gender Female, 68.8%
Happy 55.2%
Calm 22.7%
Sad 18.3%
Angry 1.4%
Fear 1%
Confused 0.5%
Disgusted 0.5%
Surprised 0.4%

Feature analysis

Amazon

Person 99.3%
Car 98.5%
Wheel 57.7%

Captions

Microsoft

a group of people standing next to a window 59%
a group of people standing in front of a window 57.7%
a group of people in front of a window 57.3%

Text analysis

Amazon

KODAK-
KODAK- SALETA
S.
SALETA

Google

YT37A2-YAGOX