Human Generated Data

Title

Dancing Shepherdess

Date

1784

People

Artist: Wilson Lowry, British 1762 - 1824

Artist after: Claude Lorrain, French 1604 - 1682

Publisher: John Boydell, British 1719 - 1804

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Belinda L. Randall from the collection of John Witt Randall, R12618

Machine Generated Data

Tags

Amazon
created on 2022-01-24

Person 99.4
Human 99.4
Person 98.9
Person 98.1
Person 98
Person 97.2
Art 97.1
Painting 97.1
Person 96.4
Person 96.1
Person 92.1
Person 87
Person 80.5
Person 79.5
Drawing 58
Person 53.3

Imagga
created on 2022-01-24

landscape 48.4
sky 46.7
snow 46
tree 45
weather 33.2
trees 30.3
winter 27.3
water 25.4
scenic 24.6
clouds 24.6
forest 24
scenery 23.5
season 21.9
cold 21.6
park 21.5
field 21
outdoors 19
river 18.7
shore 18.7
beach 18.5
scene 18.2
lake 17.9
rural 17.7
travel 17.6
country 17.6
countryside 17.4
cloud 17.3
land 16.7
sun 16.5
horizon 16.2
lakeside 16.1
sunset 15.3
fog 14.5
sunrise 14.1
atmosphere 14
shoreline 13.9
morning 13.6
vacation 13.1
peaceful 12.8
mountain 12.8
light 12.7
snowy 12.7
wood 12.5
frost 12.5
cloudy 12.2
grass 11.9
freeze 11.7
woods 11.5
seasonal 11.4
natural 11.4
sea 11
fall 10.9
ocean 10.8
outdoor 10.7
geological formation 10.7
summer 10.3
evening 10.3
mountains 10.2
branch 10.1
coast 9.9
environment 9.9
barrier 9.7
autumn 9.7
frozen 9.6
bright 9.3
dark 9.2
calm 9.2
meadow 9
foggy 8.9
swamp 8.8
sunny 8.6
holiday 8.6
basin 8.6
ice 8.6
old 8.4
silhouette 8.3
range 8.3
island 8.3
sandbar 8.1
recreation 8.1
wilderness 8.1
day 7.9
empty 7.7
lonely 7.7
solitude 7.7
pond 7.7
structure 7.6
tourism 7.4
pine 7.4
alone 7.3
national 7.3
tranquil 7.3
road 7.2
natural depression 7.2
sand 7.1

Google
created on 2022-01-24

Microsoft
created on 2022-01-24

tree 94.3
text 90.5
outdoor 89.5
old 78.1
drawing 55.5
vintage 27.3
painting 16.1
picture frame 6.2

Face analysis

Amazon

AWS Rekognition

Age 22-30
Gender Female, 53.1%
Calm 92.4%
Surprised 2.9%
Sad 1.5%
Happy 1%
Confused 0.8%
Disgusted 0.6%
Fear 0.4%
Angry 0.4%

AWS Rekognition

Age 23-33
Gender Female, 99.7%
Calm 87.1%
Sad 10.3%
Fear 1.2%
Happy 0.4%
Surprised 0.3%
Angry 0.3%
Confused 0.2%
Disgusted 0.2%

AWS Rekognition

Age 14-22
Gender Female, 78.1%
Disgusted 81.4%
Angry 10%
Fear 3.6%
Surprised 2%
Calm 1.3%
Happy 0.6%
Sad 0.6%
Confused 0.5%

AWS Rekognition

Age 18-26
Gender Female, 92.6%
Calm 83.4%
Sad 4.4%
Surprised 3.2%
Fear 2.3%
Happy 2%
Angry 1.8%
Confused 1.7%
Disgusted 1.2%

AWS Rekognition

Age 20-28
Gender Female, 75.3%
Calm 64.4%
Sad 31%
Happy 1.8%
Disgusted 1%
Fear 0.8%
Angry 0.4%
Surprised 0.2%
Confused 0.2%

AWS Rekognition

Age 7-17
Gender Female, 52.7%
Calm 77.4%
Fear 5.7%
Confused 5.3%
Sad 4.9%
Angry 4%
Surprised 1.4%
Disgusted 0.8%
Happy 0.5%

Feature analysis

Amazon

Person 99.4%
Painting 97.1%

Captions

Microsoft

a vintage photo of a horse 76.6%
a vintage photo of a person 67.8%
an old photo of a horse 67.7%

Text analysis

Amazon

-
- للضوم
4kg
للضوم