Human Generated Data

Title

Untitled (man and two young girls posed standing in potato field)

Date

c. 1949

People

Artist: O. B. Porter Studio, American, active 1930s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10825

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Apparel 99.4
Shorts 99.4
Clothing 99.4
Human 95.9
Field 90.8
Nature 89.5
Person 86.9
People 83.7
Outdoors 83.1
Crowd 72.3
Standing 71.2
Countryside 64.3
Person 61.8

Imagga
created on 2022-01-29

negative 49.5
snow 43.5
film 39.2
landscape 38.7
tree 31.2
sky 27.9
weather 27.3
photographic paper 26.6
field 24.3
winter 23.8
trees 23.1
season 22.6
rural 21.1
grunge 20.4
cold 19.8
forest 19.1
scene 19
photographic equipment 17.7
scenic 17.6
outdoor 16.8
old 16.7
horizon 16.2
land 16
texture 16
park bench 15.9
cloud 15.5
vintage 14.9
countryside 14.6
frame 14.2
country 14.1
pattern 13.7
outdoors 13.6
bench 13.6
scenery 13.5
fog 13.5
wood 13.3
space 13.2
antique 13
sun 13
natural 12.7
snowy 12.6
farm 12.5
tool 12.5
frost 12.5
park 12.4
grungy 12.3
sunrise 12.2
handsaw 12.1
ice 12
damaged 11.4
black 11.4
agriculture 11.4
paint 10.9
border 10.9
foggy 10.8
autumn 10.5
seasonal 10.5
summer 10.3
grass 10.3
graphic 10.2
spring 10.2
clouds 10.1
rough 10
light 10
travel 9.9
art 9.8
covered 9.7
edge 9.6
saw 9.6
hedge trimmer 9.6
frozen 9.6
sprinkler 9.5
paper 9.4
seat 9.4
peaceful 9.2
environment 9
dirty 9
retro 9
design 9
material 8.9
mountain 8.9
frosty 8.8
freeze 8.7
weathered 8.6
dark 8.4
meadow 8.1
garden tool 8
river 8
plant 8
mechanical device 7.9
text 7.9
structure 7.8
scenics 7.6
cloudscape 7.6
water 7.3
branch 7.3
track 7.3
tranquil 7.2
hand tool 7.2
road 7.2
sunset 7.2
bright 7.2
textured 7

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

outdoor 95.6
beach 91.4
text 91.1
black and white 85.2
landscape 84.6
cloud 80.9
sky 76
white 69.7
water 64.9
nature 50.3
picture frame 8.6
crowd 1.8

Face analysis

Amazon

AWS Rekognition

Age 18-24
Gender Male, 94.4%
Calm 65.5%
Happy 29.4%
Surprised 1.8%
Confused 1.3%
Disgusted 0.7%
Angry 0.5%
Fear 0.4%
Sad 0.3%

AWS Rekognition

Age 29-39
Gender Female, 84.7%
Happy 57.3%
Confused 11.3%
Surprised 9.9%
Disgusted 8.3%
Fear 5.7%
Calm 2.9%
Angry 2.6%
Sad 2%

AWS Rekognition

Age 25-35
Gender Male, 99.9%
Surprised 58.9%
Calm 12.8%
Happy 11.9%
Fear 10.9%
Confused 2.5%
Disgusted 1.5%
Angry 0.9%
Sad 0.6%

Feature analysis

Amazon

Person 86.9%

Captions

Microsoft

a person standing in front of a field 75.6%
a person standing in front of a field 69.5%
a boy standing in front of a field 39.8%

Text analysis

Amazon

RODVRCVLELIA