Human Generated Data

Title

Untitled (overhead view of outdoor wedding reception with umbrellaed tables)

Date

1959

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9632

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 97.1
Human 97.1
Water 96.6
Person 93.6
Person 89.7
Person 88.9
Waterfront 88.1
Person 83.8
Tent 83.3
Vehicle 81.9
Transportation 81.9
Dock 78.3
Pier 78.3
Port 78.3
Tent 78
Military 73.5
Tent 70.6
Person 69
Canopy 68.3
Tent 66.4
Person 65.4
Ship 63.3
Person 60.4
People 58.8
Person 57.8
Harbor 55.9
Person 54.7
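The list above repeats several labels ("Person", "Tent") at different confidence levels. A minimal sketch (hypothetical, not the actual tagging pipeline) of collapsing such machine-generated tags to one entry per label, keeping the highest confidence and dropping anything below a threshold:

```python
# Sketch: deduplicate (label, confidence) pairs from an image-tagging
# service, keeping the best score per label above a cutoff.

def dedupe_labels(tags, threshold=50.0):
    """tags: iterable of (label, confidence) pairs.
    Returns {label: max confidence} for entries at or above threshold."""
    best = {}
    for label, conf in tags:
        if conf >= threshold and conf > best.get(label, 0.0):
            best[label] = conf
    return best

# A few of the Amazon tags listed above:
sample = [("Person", 97.1), ("Person", 93.6), ("Tent", 83.3),
          ("Tent", 78.0), ("Harbor", 55.9)]

print(dedupe_labels(sample))
# {'Person': 97.1, 'Tent': 83.3, 'Harbor': 55.9}
```

The same collapse applied to the full Amazon list would reduce the eleven "Person" entries to the single 97.1 score, which is the figure the "Feature analysis" section below reports.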

Imagga
created on 2022-01-23

radio telescope 100
astronomical telescope 88.1
telescope 65.9
magnifier 43.8
sky 23
scientific instrument 22
travel 21.8
city 21.6
architecture 20.3
water 20
building 18.3
wreckage 15.7
ship 15.6
warship 14.3
aircraft carrier 14.2
sea 13.4
river 13.3
clouds 12.7
part 12.7
landscape 12.6
aerial 11.6
tower 11.6
vacation 11.4
technology 11.1
town 11.1
tourism 10.7
structure 10.5
cityscape 10.4
military vehicle 10.3
above 9.7
cloud 9.5
ocean 9.1
boat 9.1
old 9.1
world 8.9
urban 8.7
boats 8.7
scene 8.6
construction 8.6
aircraft 8.4
solar dish 8.4
house 8.4
exterior 8.3
air 8.3
street 8.3
earth 8.2
global 8.2
industrial 8.2
holiday 7.9
harbor 7.7
winter 7.7
roof 7.6
power 7.5
equipment 7.5
digital 7.3
business 7.3
vehicle 7.3
landmark 7.2
summer 7.1
work 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 96.6
black and white 93
water 81.2
monochrome 57.1
ship 51.2

Face analysis

Amazon

AWS Rekognition

Age 19-27
Gender Female, 72.6%
Sad 31.5%
Calm 30.6%
Happy 18.2%
Fear 5.4%
Disgusted 4.9%
Surprised 3.5%
Angry 3.4%
Confused 2.5%

AWS Rekognition

Age 27-37
Gender Female, 60.7%
Calm 97.7%
Happy 1%
Sad 0.3%
Angry 0.2%
Surprised 0.2%
Disgusted 0.2%
Confused 0.2%
Fear 0.2%

AWS Rekognition

Age 18-26
Gender Female, 54.8%
Calm 68.8%
Sad 19.7%
Happy 3.3%
Confused 2.5%
Angry 1.8%
Disgusted 1.4%
Surprised 1.4%
Fear 1.2%

AWS Rekognition

Age 24-34
Gender Male, 95%
Calm 80.2%
Sad 6.7%
Angry 5.3%
Surprised 2.8%
Confused 1.6%
Disgusted 1.6%
Happy 1.1%
Fear 0.7%

AWS Rekognition

Age 18-26
Gender Female, 74.8%
Calm 96.9%
Sad 1.1%
Angry 0.6%
Fear 0.5%
Happy 0.5%
Surprised 0.2%
Disgusted 0.2%
Confused 0.1%

AWS Rekognition

Age 30-40
Gender Male, 88.6%
Calm 53.8%
Happy 20%
Fear 9.7%
Surprised 8%
Angry 3.5%
Disgusted 1.8%
Sad 1.8%
Confused 1.4%
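Each face record above lists all eight Rekognition emotion scores. A minimal sketch (hypothetical helper, not the museum's code) of reducing such a record to its dominant emotion:

```python
# Sketch: pick the highest-scoring emotion from a Rekognition-style
# face record given as {emotion name: confidence percent}.

def dominant_emotion(emotions):
    """Return (name, confidence) for the top-scoring emotion."""
    name = max(emotions, key=emotions.get)
    return name, emotions[name]

# The second face record above:
face = {"Calm": 97.7, "Happy": 1.0, "Sad": 0.3, "Angry": 0.2,
        "Surprised": 0.2, "Disgusted": 0.2, "Confused": 0.2, "Fear": 0.2}

print(dominant_emotion(face))
# ('Calm', 97.7)
```

Note that a near-tie such as the first record (Sad 31.5% vs. Calm 30.6%) makes the single dominant label unreliable; the full distribution is more informative there.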

Feature analysis

Amazon

Person 97.1%
Tent 83.3%

Captions

Microsoft

calendar 10.3%

Text analysis

Amazon

25034